Paris 2024 Olympic Games: the CNCDH warns about algorithmic video surveillance less than a month before the Games

Paris 2024 Olympic Games. Photo: MAXPPP / Alexis Sciard

The Commission nationale consultative des droits de l'homme (CNCDH) points to the "risks to freedoms" posed by algorithmic video surveillance (VSA) in an opinion published in the Journal officiel on Tuesday, less than a month before the Paris Olympic Games.

VSA, which is being tested in France as part of the Olympic Games (July 26 to August 11), consists of software coupled with surveillance cameras to identify events judged suspicious or risky and alert an operator in real time.

In a lengthy opinion on "the surveillance of public space", the CNCDH makes "two observations" in particular regarding "the impact of VSA on fundamental rights and freedoms".

Among the eight "predetermined events", defined by the "Olympic Games law" of April 2023, that VSA must detect is "an alert on an abnormal direction of pedestrian traffic", for example a person walking against the flow of the crowd.

For the commission, this provision reflects "an excessively standardized conception of public order" and "thus exposes the system to inappropriate reports which can lead to undue arrests".

The commission is also concerned about the unprecedented involvement of private actors

The CNCDH further considers that "it will be difficult in practice to ensure control" of the guarantees provided for by law, in particular given the lack of sufficient resources allocated to the National Commission for Informatics and Liberties (CNIL), and fears that citizens will develop "a feeling of increased surveillance" during the Games.

The commission is also concerned about "the unprecedented involvement of private actors – software designers – in the exercise of a sovereign mission" and the "discrimination" it could induce.

Since VSA relies on software designed to "identify worrying situations", "the machine could therefore come to associate a certain level of risk with certain recurring characteristics", such as "wearing a hood", the commission warns.

In operational use, this software is configured to focus, "for example, on the type of vehicle to be detected", but the commission fears that "an agent in charge of its use within the urban supervision centre could indirectly target certain categories of the population travelling on the public highway".

If "checking the absence of bias is currently akin to wishful thinking", deplores the CNCDH, nevertheless “it would be possible to bring to light the discriminations produced through its use” and, therefore, “take all measures to remedy them”.
