Trapped by the algorithm: concerns over social media bias
Do you feel like you always see the same content on social networks? That is no accident: content is pushed by algorithms, programs that prioritize what these platforms show each user, at the risk of locking them in a bubble.
The Assises du journalisme in Tours devoted a conference on Thursday to these “information bubbles”, the “single menus” imposed on us by the algorithms of YouTube, Facebook or Twitter.
A major challenge for the future, since young people are getting more and more information via social networks. According to the Kantar – La Croix barometer published in January, these platforms are the second source of information for French people aged 18 to 24, behind television news.
While Facebook remains the most widely used social network worldwide for getting news, young people are turning massively to image-based apps such as TikTok and Instagram, according to the Reuters Institute's 2022 report.
All these networks operate according to their own algorithm, a kind of computer recipe that determines what each user sees based on their profile and browsing history.
“Every time we like, retweet or comment on content, we signal to the machine: ‘that's interesting, that makes me react, that keeps me connected’,” explained Mathilde Saliou, a journalist at the specialized outlet Next INpact.
From then on, it is this type of content that is more readily served to the user.
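The feedback loop described above can be sketched as a toy ranking function. This is purely illustrative: the weights, field names and scoring formula are assumptions made up for the example, not any platform's actual algorithm.

```python
# Toy sketch of engagement-based feed ranking.
# All weights and field names are hypothetical, chosen only to
# illustrate the loop: react to a topic -> see more of that topic.

def score(post, user_history):
    """Score a post; boost it when it matches topics the user engaged with."""
    # Raw engagement signal (arbitrary illustrative weights).
    base = post["likes"] + 2 * post["comments"] + 3 * post["shares"]
    # Personalization: each past reaction to this topic raises the boost.
    affinity = 1.0 + 0.5 * user_history.get(post["topic"], 0)
    return base * affinity

def build_feed(posts, user_history):
    """Order posts from highest to lowest personalized score."""
    return sorted(posts, key=lambda p: score(p, user_history), reverse=True)

posts = [
    {"topic": "cats", "likes": 10, "comments": 2, "shares": 1},
    {"topic": "news", "likes": 50, "comments": 5, "shares": 3},
]
# A user who has already reacted to "cats" content 7 times:
history = {"cats": 7}
feed = build_feed(posts, history)
```

Even though the "news" post has far more raw engagement, the accumulated affinity pushes the "cats" post to the top of this user's feed, which is the bubble effect in miniature.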
“That is how social networking platforms are built,” continued Mathilde Saliou. Their “most frequent business model” is “to earn money by showing advertising”, which requires that the user “stays connected as long as possible”.
Enough to lock them in a “filter bubble”, a concept coined by the American Eli Pariser in 2011.
The real existence of these bubbles “is not a matter of consensus”, Mathilde Saliou tempered. According to her, it is often an “effect that you can get out of” by taking the initiative to seek out other content.
On the other hand, “there are concerns about certain specific effects” that can draw the user into “spirals of radicalization”.
Thus, starting to watch conspiracy videos on YouTube claiming, for example, that the Earth is flat exposes you to then being bombarded with similar videos.
On a lighter note, Xavier Eutrope, a journalist at INA's media review, recounted the case of a friend who found herself swamped with videos of “Turkish masons” on TikTok, without understanding why.
Another cause for concern: the content most likely to keep the user on the platform is often the most divisive and controversial.
“Emotions sell and create engagement, from a simple like to comments and shares”, “especially” when they are “negative”, noted Cyrille Frank, of the digital consulting agency CosaVostra.
These “negativity algorithms” are a “danger”, in particular for young people, who “see the news on social networks”, warned David Medioni, of the Jean-Jaurès Foundation, during another debate at the Assises.
With the academic Guenaëlle Gault, he has just released an essay entitled “Quand l'info épuise”, devoted to the “information fatigue” that many French people say they suffer from when faced with an overload of news.
Finally, all observers point to the opacity of the algorithms, whose workings are a secret jealously guarded by the platforms.
The DSA, a new European regulation on digital services that will soon take effect, provides that states will have access to the algorithms of the major platforms. But it remains to be seen to what extent the platforms will agree to comply.
A few weeks ago, Twitter's new owner, the controversial billionaire Elon Musk, promised to make the network's algorithm public. So far, that promise has remained a dead letter.
For Cyrille Frank, “it is desirable for citizens to understand what sauce they are going to be eaten with” on social networks. After that, “everyone makes their own choices”.