CNIL publishes a roadmap on facial recognition devices

After strongly opposing facial recognition devices in two high schools in the south of France, the Commission nationale de l'informatique et des libertés (CNIL) published an explanatory note on this technology on November 15, 2019. Its purpose is stated in the opening lines of the document: to "present the technical, legal and ethical elements that must be taken into account in approaching this complex issue".

France's privacy watchdog is convinced that, behind the technical aspects of facial recognition, there are real "political choices" to be made, choices that will "draw the contours of the world of tomorrow".

A sometimes blurred definition

The CNIL regrets that the various facets of the technology are often confused, and considers that these conflations prevent an informed debate from taking place. This is an opportunity for it to set out its own definition: facial recognition is "a probabilistic computer technique that automatically recognizes a person on the basis of their face, in order to authenticate or identify them". Probabilistic, because the comparison yields a probability, more or less strong, that the person is the one being sought. Furthermore, facial recognition can pursue two objectives: authenticating a person, that is, verifying that they are who they claim to be, or identifying a person, that is, finding them among a mass of individuals.

The privacy watchdog reiterates that facial recognition technologies are not the same as video recording devices, whether video surveillance or video protection cameras, which only allow people to be filmed. Indeed, the latter do not make it possible to automatically recognize individuals. But that is no reason to overlook their potential impact. On the contrary, "the debate must take into account this technological continuum". Indeed, the possibility of combining these different devices (applying facial recognition software to existing cameras) has the effect of "multiplying their impact on people".

Moreover, the CNIL is worried about the lack of "contact" in some facial recognition devices, which make the machine disappear entirely from the user's field of vision. "At a time when 'seamless' technologies are being promoted, it must be remembered that certain frictions are necessary." According to the authority, they provide the "reminders of reality" that people need in order to assert their rights.

Assessing uses case by case

The administrative authority mentions several French or European uses of facial recognition: the automatic recognition of people in an image (for example on a social network), access to services, tracking a passenger's journey, searching for a person's civil status, and identifying wanted individuals in the street. According to the authority, the reasoning must proceed case by case when deciding whether a given use of this tool pursues a legitimate objective or not. "While there may be legitimate or legal cases of use of facial recognition, they should not lead to the belief that everything would be desirable or possible." This position is logical, because biometric information is sensitive data, tied to the privacy of individuals in the same way as health data or religious beliefs.

Unlike many reports on this issue, the CNIL also devotes part of its note to the question of cost. "It most often affects local authorities or public authorities without the return on investment always being measured methodically and accurately." In a context of rationalization of public spending, this question cannot be evaded. For the CNIL, decisions taken in this area necessarily involve "the allocation of new resources or the reallocation of resources allocated to other devices".

Three requirements to be met

Finally, aware of the growing willingness of public authorities to run trials of this technology, the CNIL sets out three requirements that must be met before experimenting with a facial recognition device. The first concerns the standards to be respected, because "borders pre-exist at the origin". The authority calls for the development of an experimental framework that sets "red lines beyond which no use, even experimental, can be admitted".

Secondly, the devices must "place respect for people at the heart of the process", that is, guarantee consent, transparency and security. And finally, they must "adopt a sincerely experimental approach", which implies limits in time and space. The authority states that this caution is not intended to "clamp down on technological innovation" but to "test and perfect technical solutions that respect the legal framework".
