The specific way Clearview AI gathers its data enables biometric mass surveillance, a practice also adopted by other actors such as PimEyes.

How did Clearview AI build a database of 10 billion images?

Clearview AI gathers data automatically, through a process called (social media) online scraping. The company scrapes the pictures of our faces from the entire internet, including social media applications, and stores them on its servers. Through this gathering and storage of scraped data, Clearview AI managed to create a database of 10 BILLION images. Now, the company uses an algorithm that matches a given face against all the faces in its 10-billion-image database: (virtually) everyone and anyone.

Creepily enough, this database can be made available to any company, law enforcement agency or government that can pay for access. This will go on as long as we don't put a stop to Clearview AI and its peers.

Reclaim Your Face partners and other organisations have taken several actions to limit Clearview AI in France, Italy, Germany, Belgium, Sweden, the United Kingdom, Australia and Canada. In several EU countries, activists, Data Protection Authorities and watchdogs took action. In May 2021, a coalition of organisations (including noyb, Privacy International (PI), Hermes Center and Homo Digitalis) filed a series of submissions against Clearview AI, Inc. The complaints were submitted to data protection regulators in France, Austria, Italy, Greece and the United Kingdom.

Here are some of the Data Protection Authorities' and watchdogs' decisions:

Following complaints from Reclaim Your Face partner Privacy International and from individuals about Clearview AI's facial recognition software, the French data protection authority (CNIL) decided in December 2021 that Clearview AI must cease collecting and using data from data subjects in France.

After individuals (including Riccardo Coluccini) and Reclaim Your Face organisations (among them the Hermes Center for Transparency and Digital Human Rights and Privacy Network) filed a complaint against Clearview AI, Italy's data protection watchdog (Garante per la Protezione dei Dati Personali) fined Clearview AI the highest amount possible: 20 million euros. The decision includes an order to erase the data relating to individuals in Italy and bans any further collection and processing of their data through the company's facial recognition system.

Following an individual complaint from Reclaim Your Face activist Matthias Marx, the Hamburg Data Protection Authority ordered Clearview AI to delete the mathematical hash representing his biometric profile. As a result, the Hamburg DPA deemed Clearview AI's biometric photo database illegal in the EU. However, Clearview AI has only deleted Matthias Marx's data, and the DPA's case is not yet closed.

While the use of this software has never been legal in Belgium, and after initially denying its deployment, the Ministry of the Interior confirmed in October 2021 that the Belgian Federal Police had used the services of Clearview AI. The use stemmed from a trial period the company provided to the Europol Task Force on Victim Identification. While admitting the use, the Ministry of the Interior also emphasized that Belgian law does not allow it. This was later confirmed by a ruling of the Belgian police watchdog stating that the use was unlawful.

In February 2021, the Swedish data protection authority (IMY) decided that a Swedish local police force's use of Clearview AI's technology involved unlawful processing of biometric data for facial recognition.