Facebook rejects allegations of manipulating ‘Trending Topics’
Washington, May 13 (IANS) Amid allegations that a team of editors at Facebook is manipulating the popular “Trending Topics” by “injecting” selected articles to promote a particular perspective, the social media giant has reiterated that it does not allow or advise its reviewers to discriminate against sources of any political origin.
A report on technology website Gizmodo had accused Facebook of editorial bias against conservative news organisations, prompting a call for a congressional inquiry from Senator John Thune (Republican, South Dakota), chair of the US Senate Commerce Committee, which has jurisdiction over media issues.
The panel has also sent a letter to Facebook CEO Mark Zuckerberg asking for answers related to the “Trending Topics” row.
“Yes. We take these reports very seriously and will continue to investigate the allegations. We have found no evidence to date that ‘Trending Topics’ was successfully manipulated but will continue the review of all our practices,” posted Justin Osofsky, vice president, global operations, Facebook, on Thursday.
“The guidelines do not permit the suppression of political perspectives. About 40 percent of the topics in the queue get rejected by the reviewers because they reflect what is considered ‘noise’ – a random word or name that lots of people are using in lots of different ways,” he said, adding that this tool is not used to suppress or remove articles or topics from a particular perspective.
“Trending Topics” was launched in 2014 to surface major conversations happening on Facebook.
It appears on the right-hand side on desktop, as well as when you tap the search box in the mobile app, and is available primarily to people using Facebook in English (limited tests are being run in Spanish and Portuguese).
“At its core, ‘Trending Topics’ is designed to help people discover major events and meaningful conversations,” Osofsky posted.
According to him, the “Trending Topics” team is governed by a set of guidelines meant to ensure a high-quality product, consistent with Facebook’s deep commitment to being a platform for people of all viewpoints.
“The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum. Facebook does not allow or advise our reviewers to discriminate against sources of any political origin, period,” Osofsky stressed.
Meanwhile, the Guardian reported that it has access to leaked documents that show “how Facebook relies on old-fashioned news values on top of its algorithms to determine what the hottest stories will be for the one billion people who visit the social network every day”.
The documents show that Facebook relies on the intervention of a small editorial team to determine what makes its “trending module” headlines, the report added.
In reply, Osofsky posted: “Potential ‘Trending Topics’ are first surfaced by an algorithm that identifies topics that have recently spiked in popularity on Facebook.”
“The ‘Trending Topics’ algorithm also uses an external RSS website crawler to identify breaking events so that we can connect people to conversations on Facebook about newsworthy events as quickly as possible,” he noted.
“We have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum, as well as to eliminate noise that does not relate to a current newsworthy event but might otherwise be surfaced through our algorithm,” Osofsky clarified.
The list of “Trending Topics” is then personalised for each user via an algorithm that relies on a number of factors, including the importance of the topic, Pages a person has liked, location, feedback provided by the user about previous Trending Topics and what’s trending across Facebook overall.
“Trending is also integrated into Facebook Search so you can search for any Trending topic that may not show up in your Trending suggestions,” the post read.