
The EU AI Act officially entered into force in August 2024. While many of its requirements will be phased in between 2025 and 2027, several rules already apply, and they matter today. 

Many organisations assume that the AI Act is something to worry about later. That’s a risky assumption. Enforcement has already begun for specific areas, and regulators expect companies to start preparing now. 

In this article, we break down what’s already required under the AI Act, why these early obligations matter, and how you can quickly assess your current exposure. 


A Quick Recap: What Is the EU AI Act? 

The EU AI Act is the world’s first major regulation focused specifically on artificial intelligence. Its goal is to ensure that AI systems used in the EU are safe, transparent, fair and trustworthy. 

The Act applies not only to AI developers and vendors, but also to organisations that use AI systems, even if they didn’t build the technology themselves. And this group is growing rapidly.  


What Rules Already Apply? 

The following four areas of the EU AI Act already demand your attention:  

1. Prohibited AI Systems

Certain types of AI systems are no longer allowed under any circumstances. This includes AI that: 

  • Manipulates human behaviour in harmful ways 
  • Exploits vulnerable groups (such as children) 
  • Uses biometric categorisation based on sensitive characteristics, or social scoring 
  • Conducts facial recognition in public spaces (except under strict conditions) 

If your organisation uses or plans to use any of these technologies, action is required immediately. 


2. Transparency Obligations

AI systems that interact with users or generate content must be clearly disclosed. For example: 

  • Users must be informed when they’re speaking to a chatbot 
  • Synthetic audio, video or images (e.g. deepfakes) must be labelled as such 
  • AI-generated content must not be misleading 

Even general-purpose AI tools may fall under this requirement depending on how they’re deployed. 


3. Prepare for the Future

The EU is clear: companies must start preparing for future compliance now. Regulators are unlikely to accept “we didn’t know” as an excuse once the next phase of obligations becomes enforceable. 

That means assessing your systems, understanding your role, and taking steps toward responsible AI governance today, not in 2026 or later. 


4. New Obligations Since August 2025

Organisations that develop or use general-purpose AI models, such as foundation models or large language models (LLMs), now face specific requirements under the AI Act. These include:

  • Technical documentation and transparency
  • Copyright compliance for training data
  • Risk assessment and reporting (for high-impact models)
  • Clear user guidance and usage instructions

From now on, national enforcement authorities can audit and penalise non-compliance. If your organisation builds, integrates or distributes AI systems, these rules likely affect you directly.


Why This Matters Now 

Even if you’re not building AI, you might still be legally responsible for how it’s used within your organisation. 

Examples: 

  • HR uses AI for screening or shortlisting candidates 
  • Customer service uses AI chatbots 
  • Marketing uses generative AI for copy or content 
  • Finance or operations use scoring or forecasting models 

Without clear oversight and documentation, these systems could expose your organisation to: 

  • Legal risk and regulatory enforcement 
  • Reputational damage 
  • Loss of trust with users or partners 
  • Heavy financial penalties

How SimplyComplai Helps 

At SimplyComplai, we help organisations cut through the complexity of the AI Act, even if you’re just starting out. Our simple self-assessment approach lets you quickly understand your current compliance status and pinpoint exactly what still needs attention, without the immediate cost of bringing in an expensive consultant.

👉 The AI Compliance Scan is a free, fast way to: 

  • Discover whether you count as an AI provider or an AI user 
  • Understand what legal obligations may already apply 
  • Identify next steps for reducing risk and staying compliant 

It only takes a few minutes and gives you clarity when it matters most. 

Meanwhile, we are working hard on our Gap Analysis Tool, which will support you in discovering where you already comply and what still needs to be done. If you need help, we can support you.

The EU AI Act isn’t just a future concern: it’s here, and the first rules are already being enforced. Starting early gives your organisation a head start in managing AI responsibly, building trust, and avoiding surprises. 

Still not sure what applies to you? Let us help you make AI compliance simple. 


📚 Want to learn more? Read our latest articles or check out our FAQ section.

💡 Have specific questions about AI compliance? Get in touch with us.
