The European Parliament has passed the first draft of the AI Act, paving the way for a ban on police use of live facial recognition technology in public places.
The Act sets regulations according to four risk levels, with certain applications, including biometric surveillance, emotion recognition and predictive policing, banned outright.
Generative AI systems such as ChatGPT will be required to disclose that their content is AI-generated, and AI systems used to influence voters in elections will be classified as high-risk, as will the recommender systems used by social media platforms.
“The AI Act will set the tone worldwide in the development and governance of artificial intelligence, ensuring that this technology, set to radically transform our societies through the massive benefits it can offer, evolves and is used in accordance with the European values of democracy, fundamental rights, and the rule of law,” says co-rapporteur Dragos Tudorache.
The only exception to the ban on remote biometric identification is its use after the event to help prosecute serious crimes, and then only with judicial authorization.
Biometric categorization systems using sensitive characteristics such as gender, race, ethnicity, citizenship status, religion or political orientation are also banned, as are predictive policing systems based on profiling, location or past criminal behavior.
Also banned are emotion recognition systems in law enforcement, border management, the workplace and educational institutions, along with the untargeted scraping of facial images from the internet or CCTV footage to create facial recognition databases.
As for the tricky area of foundation models and generative AI, there’s a requirement for providers to assess and mitigate possible risks, and to register their models with an EU database before their release on the EU market. Generative AI systems based on such models, like ChatGPT, will have to disclose that content has been AI-generated, and make detailed summaries of the copyrighted data used for their training publicly available.
The draft has been broadly welcomed by privacy campaigners.
“The fact that the European Parliament is pushing for a ban on real-time face surveillance in public spaces is a historic success for the civil rights movement and a clear vote against a dystopian future of Chinese-style biometric mass surveillance in Europe,” says Pirate Party MEP Patrick Breyer.
However, he warns, “The ‘exceptions’ demanded by EU governments and the Commission would effectively remove the ban, as there are always many people who are wanted by judicial warrant.”
Meanwhile, Access Now warns that the act still has serious shortcomings, including allowing a degree of self-assessment when it comes to high-risk classification. It's particularly concerned about the use of automated risk assessments in migration procedures, and of predictive analytics systems used to curtail migration movements.
“In this historic AI Act vote, the European Parliament called for a society free from mass surveillance. However, it has drawn a thick line between the haves and the have-nots: the lack of bans for AI systems used in migration confirms the EU does not seek to protect fundamental rights when migrant people are the rights-holders,” says Caterina Rodelli, EU policy analyst at Access Now.
“Without prohibitions in the migration context, the EU is sacrificing the rights of people on the move and will deliberately put marginalized communities at risk.”