Facebook apologized to misinformation researchers for providing them with flawed, incomplete data for their work examining how users interact with posts and links on its platform, the New York Times reported. Contrary to what the company told the researchers, the data Facebook provided apparently included information for only roughly half of its US users, not all of them.
The Times reported that members of Facebook’s Open Research and Transparency team held a call with researchers on Friday to apologize for the error. Some of the researchers questioned whether the mistake was a deliberate attempt to sabotage their research or simply an instance of negligence.
The flaw in the data was first discovered by a researcher at Italy’s University of Urbino, who compared a report Facebook released publicly in August to the data it had provided solely to the researchers. The data sets didn’t match up, according to the Times.
Facebook didn’t immediately reply to a request for comment from The Verge on Saturday, but a spokesperson told the Times that the mistake was the result of a technical error and the company “proactively told impacted partners about and are working swiftly to resolve” the problem.
The August 18th report that the University of Urbino researcher used in his comparison was released in the interest of “transparency,” showing the most-viewed content in Facebook’s public News Feed between April and June of this year, its second quarter. However, the Times discovered that Facebook had shelved a report about its first quarter that portrayed the company in a much less flattering light. Facebook eventually released the shelved report.
Also in August, Facebook banned academic researchers from New York University’s Ad Observatory project from its platform after the group’s Ad Observer browser plug-in highlighted problems: its research found Facebook had failed to disclose who paid for some political ads on its site.