While most EU AI Act coverage focuses on high-risk systems and the August 2026 deadline, there is one provision that already applies to every company using AI in the EU — and most have missed it.
Article 4 on AI literacy has applied since February 2, 2025. If your organisation provides or uses AI systems in the EU, you are already required to ensure your staff has a sufficient level of AI literacy.
What Article 4 actually says
The text is short but broad:
Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.
Three things to notice:
- Both providers and deployers must comply. If you build AI or use AI, this applies to you.
- "To the best extent possible" — this is a proportionality standard, not an absolute requirement. But "we didn't try" is not a defence.
- Context matters — a data scientist working on a hiring algorithm needs deeper literacy than a receptionist using a chatbot to schedule meetings.
Who needs to comply
Article 4 applies to:
- Providers — companies that develop AI systems or have them developed and place them on the EU market under their name
- Deployers — companies that use AI systems under their own authority (not providers, but organisations that purchase and deploy AI)
- Their staff and other persons dealing with the operation and use of AI systems on their behalf — this includes contractors, freelancers, and outsourced teams
This is one of the broadest provisions in the AI Act. It applies regardless of the risk level of your AI system. Even if your AI is classified as minimal risk and has no other obligations, Article 4 still applies.
What "AI literacy" means in practice
The AI Act defines AI literacy in Article 3(56), elaborated in Recital 20, as:
...the skills, knowledge and understanding that allow providers, deployers and affected persons, taking into account their respective rights and obligations in the context of this Regulation, to make an informed deployment of AI systems.
In practical terms, people who work with AI systems should understand:
- What the AI system does and what it does not do
- How it makes decisions (at an appropriate level of abstraction for their role)
- Known limitations and failure modes — when the system is unreliable or biased
- Their responsibilities under the AI Act depending on their role
- How to interpret the system's outputs and when to override them
- When to escalate — recognising situations where the system performs outside its intended scope
What you need to do
There is no prescribed training programme. The regulation gives you flexibility. But you need to be able to demonstrate you took measures. Here is what reasonable compliance looks like:
1. Assess who interacts with AI
Map out which roles in your organisation develop, deploy, manage, or use AI systems; a minimal inventory sketch follows the list below. This includes:
- Engineering and data science teams (providers)
- Product managers who define AI system behaviour
- Operations staff who use AI tools daily
- Customer-facing staff who relay AI outputs to users
- Management who make decisions based on AI recommendations
- Procurement teams who evaluate and purchase AI tools
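A lightweight way to keep this mapping current is to hold it as a small, structured inventory rather than a one-off slide. The sketch below shows one possible shape in Python; the system names, roles, and fields are hypothetical examples, not anything the Act prescribes.

```python
from dataclasses import dataclass, field

@dataclass
class AISystemUse:
    """One AI system and the roles in the organisation that touch it."""
    name: str                                         # hypothetical system name
    our_role: str                                     # "provider" or "deployer"
    used_by: list[str] = field(default_factory=list)  # roles that operate or use it

# Hypothetical inventory entries; replace with your own systems and roles.
inventory = [
    AISystemUse("cv-screening-model", "provider", ["ML engineers", "HR staff"]),
    AISystemUse("support-chatbot", "deployer", ["Customer support", "Operations"]),
]

# Everyone appearing here needs some level of AI literacy under Article 4.
roles_in_scope = sorted({role for system in inventory for role in system.used_by})
print(roles_in_scope)  # ['Customer support', 'HR staff', 'ML engineers', 'Operations']
```

Keeping the inventory as data rather than prose also makes the next steps, assigning literacy levels and logging training, easy to cross-check.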
2. Define literacy levels by role
Not everyone needs the same depth. A useful framework (a lookup sketch follows the table):
| Role | Literacy level needed |
|---|---|
| AI/ML engineers | Deep technical + regulatory awareness |
| Product managers | System capabilities, limitations, regulatory requirements |
| Operations staff using AI | How to interpret outputs, when to override, escalation paths |
| Executives | Strategic risks, compliance obligations, liability |
| Procurement | Vendor compliance assessment, contract requirements |
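If you hold this framework as data alongside the inventory from step 1, you can automatically flag roles you have not classified yet. The lookup below is a sketch under that assumption; the role names and level labels simply mirror the illustrative table above and are not terms from the Act.

```python
# Required literacy level per role; labels mirror the table above (illustrative only).
LITERACY_LEVELS = {
    "AI/ML engineers": "deep technical + regulatory awareness",
    "Product managers": "system capabilities, limitations, regulatory requirements",
    "Operations staff": "interpreting outputs, when to override, escalation paths",
    "Executives": "strategic risks, compliance obligations, liability",
    "Procurement": "vendor compliance assessment, contract requirements",
}

def required_level(role: str) -> str:
    """Return the literacy level defined for a role, or flag it as a gap to resolve."""
    return LITERACY_LEVELS.get(role, f"UNCLASSIFIED: assign a literacy level for '{role}'")

print(required_level("Procurement"))       # vendor compliance assessment, contract requirements
print(required_level("Customer support"))  # surfaces a role you still need to classify
```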
3. Deliver training
Options include:
- Internal workshops on your specific AI systems and their documentation
- External training on the EU AI Act (many law firms and consultancies now offer these)
- Documentation — clear internal guides on each AI system, its purpose, and its limitations
- Vendor-provided training on AI tools you use (deployers should request this from providers)
- E-learning modules covering AI fundamentals and regulatory basics
4. Document everything
Keep records of:
- What training was delivered, to whom, and when
- Training materials and content
- Assessment results if applicable
- How literacy requirements were calibrated to each role
Documentation is your evidence that you took measures to the best of your ability, which is exactly what Article 4 asks for.
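You do not need a dedicated compliance tool for this. An append-only log that captures who was trained, on what, and when is already usable evidence; the sketch below writes such records to a CSV file. The file name, field names, and example values are hypothetical, not a prescribed format.

```python
import csv
from datetime import date
from pathlib import Path

RECORDS_FILE = Path("ai_literacy_training_log.csv")  # hypothetical location
FIELDS = ["date", "attendee", "role", "training", "materials", "assessment"]

def log_training(attendee: str, role: str, training: str,
                 materials: str, assessment: str = "n/a") -> None:
    """Append one training record; the growing CSV doubles as compliance evidence."""
    is_new_file = not RECORDS_FILE.exists()
    with RECORDS_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "attendee": attendee,
            "role": role,
            "training": training,
            "materials": materials,
            "assessment": assessment,
        })

# Hypothetical example entry.
log_training("J. Doe", "Operations staff",
             "Internal workshop: support-chatbot limitations",
             "workshop-slides-v1.pdf", "quiz passed")
```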
Penalties for non-compliance
Article 4 is not on the list of provisions that carry a dedicated fine under Article 99, so there is no fixed EU-level penalty for an AI literacy gap. Enforcement runs instead through the penalties and other measures that Member States must lay down under Article 99(1), and a documented lack of AI literacy can weigh against you when authorities assess compliance with the rest of the Act.
The regulation requires national penalties to be effective, proportionate and dissuasive, and to take the interests of SMEs and startups into account. But "we didn't know about this requirement" is exactly the kind of gap Article 4 is designed to prevent.
The deadline has already passed
Unlike the high-risk system requirements (August 2, 2026), AI literacy has been required since February 2, 2025. If you haven't started, you are already behind. The good news is that this is one of the simpler provisions to address — it doesn't require technical infrastructure changes, just training and documentation.
The Digital Omnibus: is Article 4 changing?
In December 2025, the European Commission proposed the Digital Omnibus on AI, which would soften Article 4 from a binding obligation to an encouragement. However, the European Parliament voted on March 26, 2026 to adopt a compromise position: keeping the mandatory obligation but lowering the standard from ensuring "a sufficient level of AI literacy" to "supporting the improvement of AI literacy."
Trilogue negotiations with the Council are ongoing, targeting May 2026 to finalise the text before the August 2, 2026 general application date.
What this means for you: Article 4 is still mandatory today. Even if the wording softens slightly, the core obligation — take reasonable measures to ensure your staff understands the AI systems they work with — is not going away. Companies that have already implemented training programmes will be well positioned regardless of how the final text reads.
Common questions
Does this apply to companies outside the EU? Yes. Providers established outside the EU are covered if they place AI systems on the EU market or put them into service there, and providers and deployers in third countries are covered where the output of the AI system is used in the EU. Read more about extraterritorial scope.
Do I need to train everyone in the company? Only staff and persons "dealing with the operation and use of AI systems." If someone never touches AI, they are not covered. But in 2026, that is an increasingly small group.
Is there an official certification or standard for AI literacy? Not yet. The Commission may issue guidance, and standards bodies (CEN/CENELEC) are working on related standards. For now, the requirement is principle-based — take reasonable, proportionate measures.
Can we use AI tools to deliver AI literacy training? Yes, and there is a certain elegance to it. Just make sure the humans involved actually learn something.
Start now
AI literacy is the lowest-effort, highest-return compliance task under the EU AI Act. It costs relatively little, protects your organisation, and makes every other compliance activity easier because your team actually understands what the regulation requires.
If you are unsure whether your AI system carries additional obligations beyond Article 4, take Annexa's free risk assessment quiz to find out your classification in under 2 minutes.