EU to delay 'high risk' AI rules until 2027 after Big Tech pushback
Published by Global Banking and Finance Review
Posted on November 19, 2025
2 min read | Last updated: January 20, 2026
The EU delays 'high risk' AI rules to 2027 after Big Tech pushback, aiming to streamline regulations and enhance competitiveness.
By Supantha Mukherjee and Bart H. Meijer
BRUSSELS/STOCKHOLM (Reuters) - The European Commission proposed on Wednesday streamlining and easing a slew of tech regulations, including delaying some provisions of its AI Act, in an attempt to cut red tape, head off criticism from Big Tech and boost Europe's competitiveness.
The move by the EU comes after it watered down some environmental laws after blowback from business and the U.S. government. Europe's tech rules have faced similar opposition, though the Commission has said the rules will remain robust.
"Simplification is not deregulation. Simplification means that we are taking a critical look at our regulatory landscape," a Commission official said during a briefing.
'HIGH RISK' AI USE IN JOB APPLICATIONS, BIOMETRICS
In a 'Digital Omnibus', which must still be debated and voted on by EU member states, the Commission proposed delaying the EU's stricter rules on the use of AI in a range of areas seen as higher risk, to December 2027 from August 2026.
That includes AI use in biometric identification, road traffic applications, utilities supply, job applications and exams, health services, creditworthiness and law enforcement. Consent for pop-up 'cookies' would also be simplified.
The Digital Omnibus, or simplification package, covers the AI Act, which became law last year, the landmark privacy legislation known as the General Data Protection Regulation (GDPR), the e-Privacy Directive and the Data Act, among others.
Proposed changes to the GDPR would also allow Alphabet's Google, Meta, OpenAI and other tech companies to use Europeans' personal data to train their AI models.
(Reporting by Supantha Mukherjee in Stockholm and Jan Strupczewski and Foo Yun Chee in Brussels)
The AI Act is the EU's legislative framework for regulating artificial intelligence technologies, focusing on risk management and compliance to ensure safety and ethical use.
The General Data Protection Regulation (GDPR) is a comprehensive data protection law in the EU that governs how personal data is collected, processed, and stored, enhancing privacy rights for individuals.
Biometric identification systems use unique physical characteristics, such as fingerprints or facial recognition, to verify an individual's identity, often used in security and access control.
The Digital Omnibus is a legislative proposal by the European Commission aimed at streamlining various digital regulations, including those related to AI, data protection, and privacy.