Get Ready for the EU AI Act
The EU Artificial Intelligence Act is setting a global standard for AI regulation, as GDPR did for data privacy. Here, we provide an overview of the Act and guide you on how to prepare your AI systems for compliance.
Timeline and Compliance Milestones
The EU AI Act entered into force on 1 August 2024 after a three-year legislative process. Since then, key milestones have already taken effect: prohibited AI practices became enforceable in February 2025, along with mandatory AI literacy requirements for all staff handling AI systems.
As of August 2025, general-purpose AI providers must comply with transparency and documentation obligations. The next major deadline is August 2026, when high-risk AI system requirements become enforceable.
- 1 August 2024: The Act officially enters into force.
- 2 February 2025: Prohibitions on unacceptable-risk practices enter into force, together with the AI literacy requirements.
- 2 August 2025: Obligations for GPAI providers, along with the rules on notifications to authorities and penalties, take effect.
- 2 February 2026: Deadline for the Commission implementing act on post-market monitoring.
- 2 August 2026: Obligations apply for high-risk AI systems listed in Annex III (biometrics, critical infrastructure, law enforcement, and other areas).
- 2 August 2027: Obligations apply for high-risk AI systems that are safety components of products requiring third-party conformity assessment (Annex I).
- 31 December 2030: Compliance deadline for AI systems that are components of large-scale IT systems established under EU law in the area of Freedom, Security, and Justice.
EU AI Act: How Compliance Actually Works
The EU AI Act doesn't sort AI systems into tidy risk tiers. It runs four independent checks, and the obligations stack. A single AI system can trigger multiple gates simultaneously.
Most guides get this wrong. Here's how compliance actually works.
Prohibited Practices
Does this AI practice cross a red line?
High-Risk Systems
Is this AI used in a high-stakes domain?
Transparency
Does this AI interact with people, detect emotions, or generate synthetic media?
General-Purpose AI
Are you providing a foundation model or GPAI?
Obligations stack: one system can trigger multiple gates
Examples (see the sketch below):
- High-risk (essential services) + Transparency (human interaction)
- Transparency only: disclose it's AI
- All three: High-risk + Transparency + GPAI obligations
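To make the stacking concrete, here is a minimal Python sketch of the four independent checks. It is an illustration under our own assumptions: the flag names, the `AISystem` record, and the gate descriptions are ours, not terms defined in the Act.

```python
from dataclasses import dataclass

@dataclass
class AISystem:
    # Illustrative flags for one AI system; not an official taxonomy from the Act.
    uses_prohibited_practice: bool   # e.g. social scoring, manipulative techniques
    high_risk_domain: bool           # e.g. credit, employment, critical infrastructure
    interacts_with_people: bool      # chatbots, emotion detection, synthetic media
    provides_gpai: bool              # you provide a foundation / general-purpose model

def applicable_gates(system: AISystem) -> list[str]:
    """Run the four checks independently; obligations stack instead of excluding each other."""
    gates = []
    if system.uses_prohibited_practice:
        gates.append("prohibited practice: must stop, no compliance path")
    if system.high_risk_domain:
        gates.append("high-risk obligations (risk management, documentation, oversight, ...)")
    if system.interacts_with_people:
        gates.append("transparency obligations (disclose AI, label synthetic media)")
    if system.provides_gpai:
        gates.append("GPAI provider obligations (technical documentation, copyright policy, ...)")
    return gates

# A credit chatbot built on your own general-purpose model trips three gates at once:
print(applicable_gates(AISystem(False, True, True, True)))
```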
The EU AI Act Covers More Than You Think
You thought you had 3 AI systems. You probably have 50. The Act's definition is broad, and most of your AI is hiding below the surface.
The average enterprise has 10x more AI systems than it assumes, and most of them have never been inventoried.
EU AI Act Definition (Article 3)
An AI system is a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.
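An inventory usually starts with one record per candidate system, screened against the Article 3 criteria above. The sketch below is a rough illustration; the field names and the `likely_in_scope` heuristic are our assumptions and are no substitute for a legal assessment.

```python
from dataclasses import dataclass

@dataclass
class InventoryEntry:
    # One row in an internal AI inventory; field names are illustrative assumptions.
    name: str
    owner: str
    machine_based: bool               # runs as software on hardware
    operates_autonomously: bool       # operates with varying levels of autonomy
    adaptive_after_deployment: bool   # may be False and the system can still qualify
    infers_outputs_from_input: bool   # predictions, content, recommendations, decisions
    influences_environment: bool      # outputs affect physical or virtual environments

def likely_in_scope(entry: InventoryEntry) -> bool:
    """Rough screening against the Article 3 definition; adaptiveness is optional ('may exhibit')."""
    return (entry.machine_based
            and entry.operates_autonomously
            and entry.infers_outputs_from_input
            and entry.influences_environment)

# A resume-ranking service that never retrains still screens as in scope:
print(likely_in_scope(InventoryEntry("resume-ranking", "HR", True, True, False, True, True)))
```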
The EU AI Act Follows Your AI
Like GDPR, the EU AI Act is extraterritorial. It applies based on who you affect, not where you're headquartered.
The Chain of Responsibility
A real-world example of how the EU AI Act reaches across borders
1. A model developer outside the EU builds an AI credit-scoring model and licenses it to a fintech platform; the system is placed on the EU market through the value chain.
2. The fintech platform deploys the high-risk AI, affecting persons in the EU.
3. Those persons have credit decisions made about them, and they are protected by the EU AI Act.
Compliance Requirements
The Act lays out a range of requirements for high-risk AI systems, covering:
- Risk management
- Data and data governance
- Technical documentation
- Record-keeping (logging)
- Transparency and provision of information to deployers
- Human oversight
- Accuracy, robustness, and cybersecurity
- Fundamental rights impact assessment*
* Required only for public sector deployers and private deployers using high-risk AI for credit scoring or life/health insurance risk assessment.
How Modulos Helps You Meet Every Requirement
The Modulos AI Governance Platform addresses each EU AI Act obligation with purpose-built tools.
Conformity Assessments
High-risk AI systems must undergo Conformity Assessments to demonstrate compliance before market entry. This structured process ensures your AI systems meet regulatory requirements.
Step 1 - A high-risk AI system is developed
Establish, implement, document, and maintain a risk management system to address the risks posed by a high-risk AI system.
Step 2 - The system undergoes the conformity assessment and complies with AI requirements
- Implement effective data governance, including bias mitigation, training, validation, and testing of data sets.
- Maintain up-to-date technical documentation in a clear and comprehensive manner.
Step 3 - Registration of stand-alone systems in an EU database.
- Ensure that high-risk AI systems allow for the automatic recording of events (logs) over their lifetime.
- Design systems to ensure sufficient transparency so that deployers can interpret outputs and use them appropriately.
Step 4 - A declaration of conformity is signed, and the AI system should bear the CE marking
- Develop systems to maintain an appropriate level of accuracy, robustness, and cybersecurity throughout their lifecycle.
- Ensure proper human oversight during the period the system is in use.
The system can be placed on the market. If substantial changes occur during the AI system's lifecycle, return to Step 2.
Disclaimer:
The steps outlined above are intended to provide a general overview of the conformity assessment process. They should not be considered exhaustive and are not intended as legal or technical advice.
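Of the requirements above, automatic record-keeping is one that engineering teams can wire in early. Below is a minimal sketch of an append-only event log for a high-risk system; the JSON schema, file path, and `log_event` helper are illustrative assumptions, not a format prescribed by the Act or by Modulos.

```python
import json
import time
import uuid
from pathlib import Path

# Append-only audit log for a high-risk AI system. The schema is an illustrative
# assumption; the Act requires automatic recording of events over the system's
# lifetime, not this exact format.
LOG_PATH = Path("ai_audit_log.jsonl")

def log_event(system_id: str, event_type: str, details: dict) -> dict:
    """Append one timestamped, uniquely identified event to the audit log."""
    event = {
        "event_id": str(uuid.uuid4()),
        "system_id": system_id,
        "event_type": event_type,  # e.g. "inference", "retraining", "human_override"
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "details": details,
    }
    with LOG_PATH.open("a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
    return event

# Example: record an inference and a human-oversight intervention.
log_event("credit-scoring-v2", "inference", {"input_ref": "app-1042", "score": 0.73})
log_event("credit-scoring-v2", "human_override", {"reviewer": "analyst-7", "reason": "borderline score"})
```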
Understanding Roles and Responsibilities
The EU AI Act outlines specific roles and responsibilities for stakeholders in the AI system lifecycle:
- Providers: develop an AI system or GPAI model and place it on the EU market or put it into service under their own name or trademark.
- Deployers: use an AI system under their own authority in the course of a professional activity.
- Importers: place on the EU market an AI system bearing the name or trademark of a provider established outside the EU.
- Distributors: make an AI system available on the EU market without being its provider or importer.
Modifying AI Systems
Significant modifications, such as altering core algorithms or retraining with new data, may reclassify you as a provider, necessitating adherence to provider obligations.
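As a rough illustration of how a governance workflow might flag such changes, here is a small sketch. The modification categories mirror the list in the FAQ below, and treating any one of them as role-changing is a conservative assumption of ours, not wording from the Act.

```python
# Modification categories that may reclassify a deployer as a provider.
# Treating any of them as role-changing is a conservative, illustrative assumption.
ROLE_CHANGING = {"core_algorithm_change", "retraining_with_new_data", "integration_change"}

def may_become_provider(modifications: set[str]) -> bool:
    """Return True if any logged modification could trigger provider obligations."""
    return bool(modifications & ROLE_CHANGING)

print(may_become_provider({"ui_update"}))                              # False
print(may_become_provider({"retraining_with_new_data", "ui_update"}))  # True
```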
Penalties for Non-Compliance
The EU AI Act imposes significant fines for non-compliance, calculated as a percentage of the offending company's global annual turnover or a predetermined amount, whichever is higher. Provisions include more proportionate caps on administrative fines for SMEs and start-ups.
Ensure your AI systems comply with the EU AI Act to avoid these penalties.
Request a Demo
Penalty Breakdown
- Non-compliance with prohibited AI practices: up to €35 million or 7% of global annual turnover, whichever is higher.
- Supplying incorrect, incomplete, or misleading information to authorities: up to €7.5 million or 1% of global annual turnover, whichever is higher.
- Non-compliance with other obligations of the Act: up to €15 million or 3% of global annual turnover, whichever is higher.
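As a quick illustration of the "whichever is higher" rule (and the more proportionate treatment of SMEs noted above), here is a small sketch. The function and its parameters are ours; the example uses the cap for prohibited practices.

```python
def fine_cap(global_turnover_eur: float, fixed_cap_eur: float, pct_cap: float,
             sme: bool = False) -> float:
    """Upper bound of an administrative fine: the higher of the fixed amount and the
    turnover percentage for most companies, the lower of the two for SMEs and start-ups."""
    proportional = pct_cap * global_turnover_eur
    return min(fixed_cap_eur, proportional) if sme else max(fixed_cap_eur, proportional)

# Prohibited-practice cap (EUR 35M or 7% of turnover) for a firm with EUR 2B turnover:
print(fine_cap(2_000_000_000, 35_000_000, 0.07))            # 140000000.0
print(fine_cap(2_000_000_000, 35_000_000, 0.07, sme=True))  # 35000000.0
```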
Download the EU AI Act Guide
Learn how to ensure your AI systems comply with the EU AI Act. This guide provides a clear overview of the regulation, mandatory compliance requirements, and how to prepare your AI operations for these changes.
Download the Guide
EU AI Act Guide: Foundations and Practical Insights
FAQ about the EU AI Act
What is the EU AI Act?
The EU AI Act is the European Union's flagship law regulating how AI systems are designed and deployed. It aims to protect fundamental rights, ensure safety, and foster innovation while creating a harmonized legal framework across the EU.
Who must comply with the EU AI Act?
The EU AI Act requires AI system providers based in the EU to comply with the regulation. It also applies to providers and deployers outside the EU whose AI systems are used on the EU market, so organizations worldwide may need to comply if their AI products or services reach EU users.
Does the EU AI Act apply outside the EU?
Yes. The situation is similar to the global reach of the General Data Protection Regulation (GDPR). The Act applies to providers outside the EU when the output of their AI systems is used in the EU, and non-EU deployers using AI systems in the EU are also covered. This extraterritorial scope means companies worldwide must assess their AI offerings for EU compliance.
When does the EU AI Act take effect?
The EU AI Act officially entered into force on 1 August 2024 and becomes fully applicable by August 2027, with different provisions taking effect at earlier milestones: prohibitions on unacceptable-risk practices (February 2025), GPAI obligations (August 2025), and high-risk system requirements (August 2026 for Annex III systems, August 2027 for Annex I systems).
How can companies prepare for the EU AI Act?
To be ready for the EU AI Act, companies will have to meet the extensive requirements stipulated in the regulation. Key steps include conducting an AI systems inventory, classifying systems by risk level, implementing the required documentation and risk management systems, ensuring sound data governance practices, and establishing human oversight mechanisms.
When do modifications to an AI system turn a deployer into a provider?
According to the EU AI Act, significant modifications to an AI system can change your role from deployer to provider, triggering additional compliance obligations. Key modifications that may reclassify you include:
- Altering core algorithms: changing the fundamental logic or algorithms of the AI system.
- Retraining with new data: using new datasets for training that substantially alter the system's performance or behavior.
- Integration with other systems: modifying how the AI system interacts with other hardware or software components.
Becoming a provider brings increased responsibilities: you must comply with all provider obligations under the Act, including conformity assessments, documentation requirements, and ongoing monitoring.
Ensure Your AI Compliance
Whether you are already using AI in your business or considering it, keeping these regulatory deadlines in mind is essential. Modulos can support your compliance journey.