Europe pushes to ban biometric surveillance



Your body is a data mine. Companies in the growing biometrics industry are developing new and alarming ways to track everything we do, right down to how we think and feel. And, in many cases, you may not even know they are following you.

But the biometrics business is on a collision course with Europe’s leading data protection experts. Both the European Data Protection Supervisor, which acts as the EU’s independent data protection body, and the European Data Protection Board, which helps member states apply the GDPR consistently, have called for a total ban on the use of AI to automatically identify people.

“The introduction of remote biometric identification in publicly accessible spaces means the end of anonymity in those places,” wrote the heads of the two bodies, Andrea Jelinek and Wojciech Wiewiórowski, in a joint statement at the end of June. AI should not be used in public spaces to recognize faces, gait, fingerprints, DNA, voices, keystrokes, or other types of biometrics, they said. There should also be a ban on attempts to use AI to predict people’s ethnicity, gender, or political or sexual orientation.

But such calls clash with the EU’s proposed AI regulations. The rules, presented in April, classify “remote biometric identification” as high risk, meaning these systems are allowed but face stricter controls than other uses of AI. Politicians across the EU will spend years debating the AI rules, and biometric surveillance has already become one of the most contentious issues. Once adopted, the regulations will define how hundreds of millions of people are surveilled in the coming decades. And the debate is starting now.

Face recognition has been controversial for years, but the real biometrics boom is targeting other parts of your body. Across the EU’s 27 member states, a number of companies are developing and deploying biometric technologies that, in some cases, aim to predict people’s gender and ethnicity and recognize their emotions. In many cases, the technology is already in use in the real world. Yet using AI to make these classifications can be scientifically and ethically dubious. Such technologies risk invading people’s privacy or automatically discriminating against them.

Take Herta Security and VisionLabs, for example. Both companies develop face recognition technology for a variety of uses and say it could be deployed by law enforcement, retail, and transport bodies. Documents from Barcelona-based Herta Security claim its clients include police forces in Germany, Spain, Uruguay, and Colombia, as well as airports, casinos, sports stadiums, shopping malls, and hotel chains such as Marriott and Holiday Inn.

Critics point out that both Herta Security and VisionLabs claim parts of their systems can be used to track sensitive attributes. “Lots of the systems, even those that are used to identify people, rely on these potentially very harmful classifications and categorizations as their underlying logic,” says Ella Jakubowska, a policy advisor on biometrics at the advocacy group European Digital Rights, which is campaigning to ban biometric surveillance across Europe.

BioMarketing, Herta Security’s facial analysis tool, is billed as a way for stores and advertisers to learn about their customers: it can extract everything from a person’s age and gender to whether they wear glasses, and it can even track facial expressions. Herta Security says the technology is “ideal” for developing targeted advertising or helping companies understand who their customers are. The tool, Herta Security claims, can also classify people by “ethnicity.” Under the GDPR, personal data revealing “racial or ethnic origin” is considered sensitive, with strict controls on how it can be used.

Jakubowska says she challenged the CEO of Herta Security over its use of ethnicity last year and that the company has since removed the claim from its marketing material, though it remains unclear whether the feature has been removed from the tool itself. Company documents hosted by third parties still list ethnicity as one of the characteristics BioMarketing can detect. Company documents from 2013 referred to the tool detecting “race,” before later updating this to ethnicity. Herta Security, which has received more than 500,000 euros in EU funding and holds an EU seal of excellence, did not respond to requests for comment.


