“Home”, “family”, “children”… be careful of the words you use with AI, UNESCO warns of gender bias

The large language models from Meta and OpenAI convey gender bias. MAXPPP – Vincent VOEGTLIN

A study unveiled this Thursday, March 7, 2024 by Unesco warns against the failings of artificial intelligence.

In a study published this Thursday, March 7, 2024, Unesco warns against gender bias in artificial intelligence, on the eve of International Women's Rights Day.

OpenAI's GPT-2 and GPT-3.5 models, the latter at the heart of the free version of ChatGPT, as well as Llama 2 from competitor Meta, show "unequivocal evidence of prejudice against women", warns the UN body. "The discriminations of the real world are not only reflected in the digital sphere, they are also amplified", underlines Tawfik Jelassi, Unesco's Assistant Director-General for Communication and Information, as reported by France 24.

"Home", "family", "children"

According to the study, conducted from August to March 2024, these interfaces tend to associate feminine nouns with words such as "house", "family" or "children", while masculine nouns are more often associated with the words "commerce", "salary" or "career".

Furthermore, the researchers asked these interfaces to produce stories about people of different origins and genders. The results showed that stories about "people from minority cultures or about women were often more repetitive and based on stereotypes".

Thus, an English man is presented as a teacher, a driver or even a bank employee, while an English woman is presented, in at least 30% of the generated texts, as a prostitute, a model or a waitress.

These companies "fail to represent all their users", deplores Leona Verdadero, a specialist in digital policies and digital transformation at Unesco.

More women

To combat these prejudices, Unesco recommends that companies in the sector build more diverse engineering teams, with more women. Indeed, women represent only 22% of the members of teams working in artificial intelligence globally, Unesco recalls, as reported by Le Figaro.

The UN body also calls on governments to put in place more regulation for an "ethical artificial intelligence".
