
Open-Source AI Models and EU AI Act Exemptions: What's Actually Exempt

The EU AI Act's open-source GPAI exemption under Article 53(2) has been widely mischaracterised as a blanket carve-out for open-source AI. It is not. The exemption is narrow, conditional, and inapplicable to the models most likely to create systemic risk. Understanding precisely what is and is not exempt — including which obligations survive the exemption, when deployers become providers, and whether Llama and Mistral actually qualify — is essential for any organization building on open-source foundation models in Europe.

Updated March 24, 2026

1. Article 53(2): The Open-Source GPAI Exemption — What It Actually Covers

Article 53(2) of the EU AI Act states that GPAI providers who make their model weights publicly available under a free and open licence are exempt from two of the four obligations in Article 53(1):

  • Art. 53(1)(a) — Technical documentation (Annex XI): EXEMPTED for qualifying open-source providers
  • Art. 53(1)(b) — Information and documentation to downstream providers: EXEMPTED for qualifying open-source providers
  • Art. 53(1)(c)–(d) — Copyright compliance policy and training-content summary: STILL APPLIES — no exemption
  • Art. 50 — Transparency obligations for user-facing systems: STILL APPLIES — no exemption

Critical point: The exemption reduces, but does not eliminate, open-source GPAI provider obligations. An open-source provider that has not published a training-content summary is in breach of Article 53(1)(d) regardless of whether its weights are freely available. The copyright compliance obligation is the most frequently overlooked surviving obligation.

2. The Systemic Risk Exception: Open-Source Models Above 10^25 FLOPs Face Full Obligations

The Article 53(2) open-source exemption contains an explicit carve-out: it does not apply to models that qualify as systemic risk GPAI under Article 51. The threshold for systemic risk classification is training compute exceeding 10^25 floating-point operations (FLOPs).

An open-source model above this threshold faces the full Article 53 obligations (technical documentation, downstream transparency, copyright compliance) plus the additional Chapter V obligations for systemic risk models under Article 55: adversarial testing (red-teaming), incident reporting to the EU AI Office, cybersecurity measures, and annual energy consumption reporting.
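To make the threshold concrete, training compute is often approximated as 6 × parameters × training tokens. The sketch below applies that heuristic (an industry rule of thumb for dense transformers, not a formula prescribed by the Act) to test a hypothetical model against the Article 51 figure:

```python
# Rough estimate of training compute vs. the Art. 51 systemic-risk threshold.
# The 6 * params * tokens approximation is a common industry heuristic,
# not a calculation method mandated by the EU AI Act.

SYSTEMIC_RISK_THRESHOLD_FLOPS = 1e25  # Art. 51 presumption threshold

def estimated_training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs for a dense transformer."""
    return 6.0 * params * tokens

def is_presumed_systemic_risk(params: float, tokens: float) -> bool:
    return estimated_training_flops(params, tokens) >= SYSTEMIC_RISK_THRESHOLD_FLOPS

# A hypothetical 70B-parameter model trained on 15T tokens:
flops = estimated_training_flops(70e9, 15e12)
print(f"{flops:.2e} FLOPs")                    # ~6.3e24, below the 1e25 line
print(is_presumed_systemic_risk(70e9, 15e12))  # False
```

Note how close a well-trained 70B model already sits to the threshold; a few-fold increase in parameters or tokens crosses it.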

The regulatory intent: The open-source exemption reflects the EU legislator's recognition that releasing model weights publicly is itself a transparency measure. That rationale breaks down at systemic risk scale — a model powerful enough to pose systemic risk does not earn reduced obligations by virtue of being open-source. The compute threshold is the hard line where the exemption ends.

The 10^25 FLOPs threshold was set at the 2023 global frontier — roughly GPT-4 level at the time of the Act's drafting. As compute efficiency improves, models at this capability level require less compute, which means future models may cross the systemic risk capability threshold without crossing the FLOPs threshold. The EU AI Act empowers the Commission to revise the threshold via delegated act under Article 51 — something TraceGov.ai monitors in real time.

3. What “Publicly Available Weights” Means Legally Under the EU AI Act

The EU AI Act does not provide a statutory definition of “publicly available weights,” but Recital 102 gives interpretive guidance. Weights qualify as publicly available when they are made available for download or access by any person without payment, registration requirement, or access control, under a licence that permits all four freedoms: inspection, use, modification, and distribution.

Critically, licences that restrict:

  • Use in certain sectors (e.g., weapons, surveillance, illegal content)
  • Deployment beyond a defined user scale (e.g., “above 700 million monthly active users”)
  • Commercial use without a commercial licence
  • Fine-tuning for specific use cases

...are not “free and open licences” under the EU AI Act standard. The Open Source Initiative (OSI) definition and the EU AI Act Recital 102 standard are not identical — the EU AI Act standard is more demanding in some respects. Organizations relying on the open-source exemption must audit their specific licence terms against the Recital 102 standard, not against OSI certification.
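Such an audit can be structured as a simple checklist. The sketch below encodes the Recital 102 four freedoms and the restriction categories listed above; the data structure and function names are illustrative scaffolding, not terms drawn from the Act:

```python
# Sketch of a licence-term audit against the Recital 102 standard.
# Restriction categories mirror the list in this section; everything
# else (names, structure) is illustrative.
from dataclasses import dataclass

@dataclass
class LicenceTerms:
    permits_inspection: bool
    permits_use: bool
    permits_modification: bool
    permits_distribution: bool
    sector_restrictions: bool        # e.g. bans on specific fields of use
    user_scale_cap: bool             # e.g. an MAU-based commercial trigger
    commercial_use_restricted: bool
    fine_tuning_restricted: bool

def qualifies_as_free_and_open(t: LicenceTerms) -> bool:
    four_freedoms = (t.permits_inspection and t.permits_use
                     and t.permits_modification and t.permits_distribution)
    no_restrictions = not (t.sector_restrictions or t.user_scale_cap
                           or t.commercial_use_restricted
                           or t.fine_tuning_restricted)
    return four_freedoms and no_restrictions

# Apache 2.0-style terms pass; a Llama-style user-scale cap fails.
apache_like = LicenceTerms(True, True, True, True, False, False, False, False)
capped_like = LicenceTerms(True, True, True, True, False, True, False, False)
print(qualifies_as_free_and_open(apache_like))  # True
print(qualifies_as_free_and_open(capped_like))  # False
```

The point of the exercise is that a single restrictive term is disqualifying: the test is conjunctive, not a balancing exercise.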

4. Conditional Obligations That Still Apply to Open-Source GPAI Providers

Even qualifying open-source GPAI providers carry a set of obligations that survive the Article 53(2) exemption:

Art. 53(1)(d)

Copyright compliance summary

A summary of the content used for training must be made publicly available, sufficiently detailed to enable rights holders to identify whether their opted-out content was used. This is a non-negotiable obligation with no open-source carve-out.

GDPR

GDPR compliance for training data

Personal data in training sets must comply with GDPR data minimization, purpose limitation, and lawful basis requirements. Open-source status does not create an exception to GDPR — personal data processing obligations apply in full.

Art. 50

Article 50 transparency for deployers

When an open-source model is deployed in a user-facing system that interacts with natural persons, the deployer carries Article 50 AI disclosure obligations. The open-source provider's exemption does not transfer to the deployer's obligations.

Art. 5

Prohibited practices compliance

Open-source models cannot be used to implement prohibited AI practices under Article 5, regardless of their open-source status. Providers releasing models capable of implementing prohibited practices must implement reasonable safeguards.

Art. 53(1)(c); DSM Directive Art. 4(3)

Rights-holder opt-out compliance

If a rights holder has published a machine-readable opt-out (robots.txt, metadata flag) and the provider trained on that content regardless, the provider faces copyright infringement exposure that the open-source exemption does not cure.
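One common machine-readable opt-out signal is a robots.txt rule blocking AI training crawlers. The sketch below checks a robots.txt file for opt-outs against two real crawler user agents (GPTBot and CCBot); whether a given signal legally suffices under DSM Directive Art. 4(3) remains an open interpretive question, so treat this as a screening aid only:

```python
# Sketch: detect robots.txt-based TDM opt-outs for known AI crawlers.
# The crawler names are real user agents; the function is illustrative.
import urllib.robotparser

AI_CRAWLERS = ["GPTBot", "CCBot"]  # OpenAI and Common Crawl user agents

def opted_out_agents(robots_txt: str, url: str = "https://example.com/") -> list[str]:
    """Return the AI crawlers this robots.txt disallows for the given URL."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [ua for ua in AI_CRAWLERS if not rp.can_fetch(ua, url)]

sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""
print(opted_out_agents(sample))  # ['GPTBot']
```

A site expressing this opt-out before crawling took place is the fact pattern that creates the infringement exposure described above.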

5. The Llama/Mistral Question: Do Meta and Mistral Qualify for the Open-Source Exemption?

This is the most-asked question in EU AI Act open-source compliance. The answer differs between the two providers and between their specific model releases.

Meta Llama

Meta releases Llama models under the Llama Community Licence, not a standard OSI-approved open-source licence. The Llama Community Licence contains use restrictions — including a restriction on users with more than 700 million monthly active users, who must obtain a separate commercial licence from Meta. This restriction is inconsistent with the Recital 102 “any person without... access control” standard. Legal analysis published by the EU AI Office's external advisory group in January 2026 concluded that the Llama Community Licence does not qualify as a “free and open licence” under the EU AI Act standard. Meta has stated it is reviewing its licence terms but has not issued a revised licence as of March 2026.

Mistral

Mistral's base models (Mistral 7B, Mixtral 8x7B) are released under Apache 2.0, which does qualify under the EU AI Act open-source standard. These models appear to qualify for the Article 53(2) exemption, subject to Mistral meeting its copyright compliance summary obligation. Mistral's enterprise-tier models (Mistral Large, Codestral under commercial licence) do not qualify. Organizations deploying Mistral should specify which model version they use and ensure it is the Apache 2.0-licensed variant.

Disclaimer: Regulatory classification of specific commercial model licences is subject to EU AI Office interpretation and may change as enforcement guidance evolves. Organizations should seek legal advice specific to their use case rather than relying solely on this analysis.

6. When Open-Source Deployers Become Providers Under the Act

Article 25(1) establishes the conditions under which a deployer of an open-source GPAI model transitions to provider status, with all the associated obligations. This is one of the most practically important provisions for organizations building on open-source foundation models.

Substantial modification

Fine-tuning or otherwise modifying the model in a way that materially changes its capabilities, safety properties, or intended purpose, and then making the modified model available to third parties.

High-risk AI system integration

Integrating an open-source GPAI model into a high-risk AI system (Annex III categories) and placing that system on the EU market. The high-risk classification attaches to the deployer as provider, regardless of how they acquired the underlying model.

Commercial redistribution

Packaging an open-source model (modified or not) as part of a product or service sold or licensed to other organizations, with the model forming a material component of the product.

Own-name placement

Placing an AI system on the market under the deployer's own name or trademark, using an open-source model as the technical foundation. The EU AI Act attaches provider obligations to the party whose name appears on the product.
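The four triggers above can be treated as a disjunctive checklist: any one of them is enough. The sketch below encodes them; the field names are illustrative, and whether a given modification is "substantial" is a legal judgment the code cannot make for you:

```python
# Sketch: the four deployer-to-provider triggers as a checklist.
# Any single trigger suffices; names are illustrative, not statutory.
from dataclasses import dataclass

@dataclass
class Deployment:
    substantially_modified_and_released: bool  # modified model shared with third parties
    integrated_into_high_risk_system: bool     # Annex III system placed on EU market
    commercially_redistributed: bool           # model packaged into a sold product
    placed_under_own_name: bool                # own name/trademark on the system

def becomes_provider(d: Deployment) -> bool:
    return any([d.substantially_modified_and_released,
                d.integrated_into_high_risk_system,
                d.commercially_redistributed,
                d.placed_under_own_name])

# Purely internal use triggers none of the four conditions:
internal = Deployment(False, False, False, False)
print(becomes_provider(internal))  # False
```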

7. EU AI Office Enforcement Approach for the Open-Source Ecosystem

The EU AI Office has publicly committed to a risk-proportionality principle for open-source enforcement, acknowledging the societal value of open-source AI and the difficulties of imposing compliance obligations on distributed communities of contributors. However, this principle has limits.

For standard open-source GPAI providers (below systemic risk threshold), enforcement priority is: (1) copyright compliance summary publication; (2) handling of verifiable rights-holder opt-out violations; (3) consumer-facing transparency failures under Article 50. The EU AI Office is not expected to pursue enforcement against individual open-source contributors who release model components that are later assembled into larger systems by third parties.

For open-source models approaching or exceeding the systemic risk threshold, the EU AI Office has stated it will use its information-request powers under Article 91 to request compute disclosure from providers who have not self-reported systemic risk status. Failure to comply is itself sanctionable under Article 101, which provides for fines of up to 3% of global annual turnover or €15 million, whichever is higher.

8. Open-Source vs Proprietary GPAI Obligations: Side-by-Side Comparison

Obligation | Proprietary GPAI | Open-Source GPAI (standard) | Open-Source GPAI (systemic risk)
Annex XI technical documentation | Required | EXEMPTED | Required
Downstream provider documentation | Required | EXEMPTED | Required
Copyright compliance summary | Required | Required | Required
Article 50 transparency | Required | Required | Required
Adversarial testing (red-teaming) | Not required | Not required | Required
EU AI Office incident reporting | Not required | Not required | Required
Cybersecurity measures | Best practice | Best practice | Mandatory
Energy consumption reporting | Not required | Not required | Required annually
GPAI Code of Practice compliance | Voluntary / conformity presumption | Voluntary / conformity presumption | Effectively mandatory
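The table above reduces to a two-variable lookup. The sketch below encodes it; the key names and function are illustrative scaffolding, while the mapping itself follows Articles 50, 53, and 55 as summarised in this article:

```python
# The comparison table, encoded as a lookup over two classification flags.
def gpai_obligations(open_source: bool, systemic_risk: bool) -> dict[str, bool]:
    exempt = open_source and not systemic_risk  # the Art. 53(2) carve-out
    return {
        "annex_xi_technical_documentation": not exempt,
        "downstream_provider_documentation": not exempt,
        "copyright_compliance_summary": True,   # survives in every case
        "article_50_transparency": True,        # survives in every case
        "adversarial_testing": systemic_risk,   # Art. 55 obligations below
        "incident_reporting": systemic_risk,
        "energy_consumption_reporting": systemic_risk,
    }

# An open-source model above the threshold loses the exemption entirely:
print(gpai_obligations(True, True)["annex_xi_technical_documentation"])   # True
print(gpai_obligations(True, False)["annex_xi_technical_documentation"])  # False
```

Notice that the open-source flag only ever flips the first two entries; everything else is driven by the systemic risk classification alone.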

9. FAQ

Does Article 53(2) exempt open-source GPAI models from all EU AI Act obligations?

No. Article 53(2) exempts qualifying open-source providers from two of four Chapter V obligations: technical documentation (Annex XI) and downstream provider documentation. Copyright compliance summary and Article 50 transparency obligations still apply. The exemption is entirely inapplicable for models above the 10^25 FLOPs systemic risk threshold.

What does 'publicly available weights' mean legally under the EU AI Act?

Recital 102 requires weights to be available to any person without payment, registration, or access control, under a licence permitting inspection, use, modification, and distribution. Licences restricting commercial use, sector use, or deployment scale do not meet this standard. The EU AI Act standard is distinct from the OSI open-source definition.

When does an open-source deployer become a provider under the EU AI Act?

Article 25(1) triggers provider status when a deployer: substantially modifies the model and releases it to third parties; integrates it into a high-risk AI system placed on the EU market; commercially redistributes it as part of a product; or places it on the market under their own name. Internal use for employee productivity does not trigger provider status.

How does the EU AI Office approach enforcement for open-source models?

The EU AI Office applies a risk-proportionality principle: enforcement priority for standard open-source models focuses on copyright compliance summary publication and consumer transparency failures. For models near the systemic risk threshold, the EU AI Office will use Article 91 information-request powers to request compute disclosure. Non-response is itself an infringement.

Does Mistral qualify for the Article 53(2) open-source exemption?

Mistral's base models (Mistral 7B, Mixtral 8x7B) released under Apache 2.0 appear to qualify. Mistral's commercial-licence models (Codestral, Mistral Large under commercial terms) do not qualify. Organizations should verify the specific model version and its licence before relying on the exemption.

Related Resources

Assess Your Open-Source GPAI Compliance Position

TraceGov.ai maps your specific open-source model deployments to EU AI Act obligations, identifies which exemptions apply, and flags where deployer-to-provider transitions may have occurred.

Get Your Compliance Assessment →