SAN FRANCISCO | Facebook has added a new tool to its fight against misinformation: the network will now give priority to original reporting, based on first-hand information and written by identified journalists.
When several articles cover the same news story, the algorithm will identify the one that "is most often cited as the source of the information" and display it at the top, the social media giant announced Tuesday, as it attempts to purge its platform of problematic content.
Facebook is targeting, in this case, the spread of articles and videos that aim not to inform, but to deceive or manipulate users for political or financial ends.
Often presented in a sensationalist way to generate "views," "clicks" and shares, such content may be produced by content farms and built on reporting by news organizations that invested resources to uncover the information in the first place.
Facebook says it wants to prioritize "original news coverage," which "plays an important role in informing people around the world, whether by breaking news through in-depth investigations, uncovering new facts and data, communicating the latest information in times of crisis, or sharing first-hand accounts."
The California company is going through a difficult period, having been accused of laxity toward political misinformation and content amounting to incitement to hatred.
The accusations are not new, but amid the protests against systemic racism in the United States, advocacy groups have called on brands to hit the company where it hurts: advertising revenue.
Nearly 200 brands, including Coca-Cola, Levi's, Unilever and Starbucks, are now boycotting Facebook for the entire month of July, and in some cases beyond, demanding that the company change its approach on the issue.
The social network will also "demote" articles that are unsigned or whose publishers are not transparent about the identity of their journalists.
"We have found that publishers who do not provide this kind of information often lack credibility and produce content just to get clicks," explained the platform, which has 1.73 billion daily users.
The announcement drew positive reactions from observers, though often tongue-in-cheek ("Funny that none of this was a priority before," Gavin Sheridan, formerly of Storyful, remarked on Twitter).
Others point to gray areas in the approach, such as the challenge for artificial intelligence of separating the wheat from the chaff.
A well-known magazine like The Economist, for example, has chosen not to byline its articles, while other publications use pseudonyms; a byline is no guarantee of reliability.
This is not, however, a major overhaul of the news feed. The algorithm change applies only to news content, and the social network clarified that users' personal choices would continue to take precedence.
"Most of the information people see in their news feed comes from sources they follow, or from sources their friends follow, and that's not going to change."
Some newspapers may benefit, but only modestly, according to the network.
In 2018, Facebook launched a major reorganization of its news feed, which has since prioritized posts shared by family and friends at the expense of news sources.
But for a substantial portion of users, the platform has replaced television and other media as a gateway to information.
According to a 2019 Pew Research Center study, 55% of American adults get their news "often" or "sometimes" via social media.
Awareness of this issue, and of the responsibility of a juggernaut like Facebook, took hold in 2018, when scandals erupted over the 2016 election, which was marked by a large-scale disinformation campaign orchestrated from abroad.
With the U.S. presidential election approaching, the California giant has deployed an arsenal of measures, from cybersecurity to stricter moderation rules, to avoid a repeat of that disaster scenario.