European Union member states and European Parliament negotiators have reached a provisional agreement, part of the European Commission's so-called digital omnibus package, that softens parts of the EU's landmark artificial intelligence rules. The move delays some obligations for high-risk AI systems to reduce legal uncertainty, negotiators said after late-night talks.
Under the compromise, requirements for high-risk AI applications — such as biometric identification and systems linked to critical infrastructure, education, employment, law enforcement and border management — are postponed from the original deadline of August 2 this year to December 2, 2027. Separately, machinery has been removed from the scope of the AI Act and will continue to be governed by existing sectoral safety rules.
The agreement also introduces a mandatory watermarking requirement for AI-generated content, taking effect from December 2. At the same time, negotiators agreed to ban AI tools that create unauthorised sexually explicit deepfakes — covering images, video and audio — with a particular focus on preventing material depicting child sexual abuse. Companies will have until December 2 this year to bring affected systems into compliance.
Supporters say the changes reduce recurring administrative costs for businesses and give firms clearer, more harmonised rules across the EU. Marilena Raouna, Cyprus's Deputy Minister for European Affairs, said the package strengthens digital sovereignty and competitiveness. Critics argue the concessions reflect pressure from industry and water down the initial protections.
Lawmakers described the measures as strengthening safeguards for children and demonstrating that EU institutions can act quickly when harms emerge. The amendments are part of a broader effort by the European Commission to simplify its digital rulebook.
The provisional deal still needs formal approval by the European Parliament in plenary and by EU governments, steps that officials described as largely procedural before the new provisions take effect.