
CE Marking for AI Systems: What European Providers Need to Know

CE marking has been the passport to the European single market for decades — from machinery to medical devices. Under the EU AI Act, high-risk AI systems now join that regime. This guide explains when CE marking is required, what prerequisites must be met, the step-by-step process, and how AI providers can navigate the intersection of AI-specific rules with existing product safety legislation.

Updated April 7, 2026

1. What CE Marking Means for AI

The CE mark (Conformité Européenne) is a mandatory conformity marking for products sold within the European Economic Area. It indicates that a product meets EU health, safety, and environmental protection standards. For decades, manufacturers of machinery, toys, electronics, medical devices, and personal protective equipment have been required to affix the CE mark before placing products on the market.

With the EU AI Act (Regulation 2024/1689), artificial intelligence systems enter the CE marking regime for the first time. Article 48 of the regulation requires providers of high-risk AI systems to affix the CE marking to their system — or to its documentation or packaging where physical affixing is impractical — before the system can be placed on the EU market or put into service.

The CE mark on an AI system communicates a specific legal claim: the provider has conducted the required conformity assessment, drawn up the EU Declaration of Conformity, compiled the technical documentation mandated by Annex IV, and ensured the system meets all applicable requirements of the regulation. It is not a quality mark or a performance guarantee — it is a declaration of regulatory compliance.

Key Distinction

CE marking applies to AI systems, not AI models. A general-purpose AI model (such as a foundation model) is not CE-marked directly. However, when that model is integrated into a high-risk AI system that is placed on the market, the resulting system must carry the CE mark. The provider of the system bears the conformity obligation.

For software-based AI systems with no physical housing, the CE marking is affixed to the EU Declaration of Conformity, accompanying documents, or the digital interface. The regulation acknowledges the intangible nature of AI by permitting digital CE marks where physical affixing is not feasible.

2. When CE Marking Is Required

CE marking is only required for high-risk AI systems. The EU AI Act does not require CE marking for minimal-risk, limited-risk, or general-purpose AI systems. Understanding which systems qualify as high-risk is therefore the critical first step.

High-Risk Categories Requiring CE Marking

High-risk AI systems are defined in two ways under the regulation:

  • Annex III (CE marking deadline: 2 August 2026): standalone high-risk AI use cases, including biometric identification, critical infrastructure, education, employment, essential services, law enforcement, migration, and justice
  • Annex I (CE marking deadline: 2 August 2027): AI systems that are safety components of products already covered by EU harmonization legislation (e.g., machinery, medical devices, automotive, lifts, marine equipment)

Exceptions

  • AI systems for military or defense purposes are excluded from the EU AI Act entirely and do not require CE marking under this regulation
  • AI systems used exclusively for research and development that are not placed on the market or put into service are exempt
  • Free and open-source AI systems are exempt unless they are high-risk systems, in which case the same obligations apply regardless of licensing model
  • AI systems listed in Annex III that do not pose significant risk may be exempt if the provider documents a justified rationale and reports to the national authority — but this exception is narrowly defined under Article 6(3)

Practical Rule of Thumb

If your AI system makes or influences decisions about people in any of the Annex III domains — or if it serves as a safety component in a product governed by existing EU product safety legislation — you almost certainly need CE marking. When in doubt, treat the system as high-risk and follow the conformity assessment process. The cost of unnecessary compliance is far lower than the cost of non-compliance penalties.
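As a rough triage, the rule of thumb above can be sketched in code. This is an illustrative sketch only, not legal advice: the domain keywords and the `is_high_risk` helper are assumptions made for this example, and only Article 6 together with Annexes I and III of Regulation (EU) 2024/1689 is authoritative.

```python
# Hypothetical triage helper -- NOT the regulation's definition.
# The domain strings below loosely paraphrase the Annex III areas.
ANNEX_III_DOMAINS = {
    "biometric identification",
    "critical infrastructure",
    "education",
    "employment",
    "essential services",
    "law enforcement",
    "migration",
    "justice",
}

def is_high_risk(domains: set[str], safety_component_of_regulated_product: bool) -> bool:
    """Flag a system for the full conformity assessment if it touches an
    Annex III domain, or if it is a safety component of a product covered
    by Annex I harmonization legislation."""
    return bool(domains & ANNEX_III_DOMAINS) or safety_component_of_regulated_product

# The triage deliberately errs toward high-risk, mirroring the rule of thumb.
print(is_high_risk({"employment"}, False))  # True
print(is_high_risk({"marketing"}, False))   # False
```

A borderline result from a sketch like this should trigger a proper Article 6 analysis, not replace one.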

3. Prerequisites: Conformity Assessment & Declaration

Before affixing the CE mark, a provider must complete two foundational requirements: the conformity assessment and the EU Declaration of Conformity. These are not optional formalities — they are legal prerequisites without which the CE mark cannot be lawfully applied.

Conformity Assessment

The conformity assessment is the process by which a provider demonstrates that their high-risk AI system meets all requirements set out in Chapter III, Section 2 of the EU AI Act. These requirements cover:

  • Risk management system (Article 9) — continuous, iterative identification and mitigation of risks
  • Data governance (Article 10) — training, validation, and testing datasets meeting quality criteria
  • Technical documentation (Article 11, Annex IV) — comprehensive documentation of design, development, and testing
  • Record-keeping and logging (Article 12) — automatic recording of events during operation
  • Transparency and information (Article 13) — clear instructions for deployers
  • Human oversight (Article 14) — design enabling effective human oversight
  • Accuracy, robustness, and cybersecurity (Article 15) — appropriate levels of performance and resilience
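Of these requirements, record-keeping (Article 12) translates most directly into code. Below is a minimal sketch of automatic event logging; the field names are assumptions made for this example, since the regulation does not prescribe a log schema.

```python
# Minimal sketch of Article 12-style automatic event logging.
# Field names (timestamp, event, plus free-form details) are
# illustrative assumptions, not a format mandated by the AI Act.
import datetime

class AuditLog:
    """Append-only record of operationally relevant events."""

    def __init__(self) -> None:
        self.records: list[dict] = []

    def record(self, event: str, **details) -> dict:
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "event": event,
            **details,
        }
        self.records.append(entry)
        return entry

log = AuditLog()
log.record("inference", model_version="1.4.2", decision="reject")
log.record("human_override", operator="op-17", new_decision="accept")
print(len(log.records))  # 2
```

In production the log would be written to tamper-evident storage and retained per the provider's record-keeping obligations, not held in memory.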

The assessment can follow one of two procedures:

  • Internal control (Annex VI): self-assessment by the provider; applies to most Annex III high-risk systems
  • Third-party assessment (Annex VII): assessment by a notified body; applies to biometric identification and categorisation systems (Annex III, point 1)

EU Declaration of Conformity

After completing the conformity assessment, the provider must draw up an EU Declaration of Conformity in accordance with Article 47. This is a formal document stating that the AI system meets all applicable requirements. The declaration must contain:

  • The name and address of the provider (and authorized representative, if applicable)
  • A statement that the declaration is issued under the sole responsibility of the provider
  • The AI system identification (name, type, version, unique reference allowing traceability)
  • The conformity assessment procedure followed (Annex VI or Annex VII)
  • Reference to relevant harmonized standards or common specifications applied
  • Where applicable, the name and identification number of the notified body and reference to the certificate issued
  • Place and date of issue, along with the name and function of the signatory
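The required contents above can be captured as a simple data structure with a completeness check. The field names below paraphrase the bullet list; they are an illustrative sketch, not an official schema for the Article 47 declaration.

```python
# Illustrative data structure for the EU Declaration of Conformity.
# Field names are assumptions paraphrasing Article 47's content list.
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class EUDeclarationOfConformity:
    provider_name: str
    provider_address: str
    sole_responsibility_statement: str
    system_name: str
    system_version: str
    traceability_reference: str
    assessment_procedure: str          # "Annex VI" or "Annex VII"
    harmonised_standards: list[str]
    notified_body: Optional[str]       # only mandatory for Annex VII
    place_of_issue: str
    date_of_issue: str
    signatory_name: str
    signatory_function: str

    def missing_fields(self) -> list[str]:
        """Return the names of mandatory elements that are still empty."""
        missing = [f.name for f in fields(self)
                   if getattr(self, f.name) in (None, "", [])]
        # A notified body is only required for third-party assessment.
        if self.assessment_procedure == "Annex VI" and "notified_body" in missing:
            missing.remove("notified_body")
        return missing
```

A provider could run `missing_fields()` as a pre-submission check; an empty result means every mandatory element is present.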

Retention Requirement

The EU Declaration of Conformity and the supporting technical documentation must be kept for 10 years after the AI system is placed on the market or put into service. Market surveillance authorities can request these documents at any time during that period.

4. The CE Marking Process Step by Step

The following sequence outlines the complete CE marking workflow for a high-risk AI system under the EU AI Act:

Step 1: Classify Your AI System

Determine whether your system falls under Annex III (standalone high-risk) or Annex I (safety component of regulated product). Apply the Article 6(3) exception analysis if you believe the system does not pose significant risk despite being listed in Annex III.

Step 2: Implement All Chapter III Requirements

Establish risk management, data governance, technical documentation, logging, transparency measures, human oversight mechanisms, and accuracy/robustness/cybersecurity testing. These are substantive requirements — the conformity assessment verifies they are met.

Step 3: Compile Technical Documentation (Annex IV)

Prepare comprehensive documentation covering the system's intended purpose, design specifications, development process, training data, testing methodology, performance metrics, risk assessment results, and monitoring plan. This documentation must be available before the conformity assessment begins.

Step 4: Establish Quality Management System

Article 17 requires providers to have a quality management system that ensures ongoing compliance. This includes documented policies, design control procedures, data management practices, testing protocols, and post-market monitoring processes.

Step 5: Conduct Conformity Assessment

Follow Annex VI (internal control) or engage a notified body under Annex VII. For self-assessment, verify all requirements are met and document the results. For third-party assessment, submit documentation to the notified body and undergo their audit process.

Step 6: Draw Up the EU Declaration of Conformity

Complete the formal declaration per Article 47, identifying the system, the applicable requirements, the conformity procedure used, and any notified body involvement. The signatory takes legal responsibility for the declaration's accuracy.

Step 7: Affix the CE Mark

Apply the CE marking in accordance with Article 48 — visibly, legibly, and indelibly on the AI system, its packaging, or accompanying documentation. For purely digital systems, include the CE mark in the digital interface and all documentation.

Step 8: Register in the EU Database

Register the high-risk AI system in the EU-wide database established under Article 71 before placing it on the market. The registration must include the system's declaration of conformity, a summary of the technical documentation, and the CE marking status.

5. Relationship to Existing CE Marking Regimes

Many products that already carry CE marking under existing EU legislation are now incorporating AI components. The EU AI Act explicitly addresses this intersection in Article 2(2) and through the Annex I framework, which lists the existing Union harmonization legislation that triggers high-risk classification when an AI system serves as a safety component.

How Dual CE Requirements Work

When an AI system is a safety component of a product already regulated by EU harmonization legislation, the provider must comply with both regulatory frameworks. The CE mark covers both sets of requirements — there is not a separate CE mark for the AI component.

  • Machinery: Machinery Regulation (EU) 2023/1230. AI Act interaction: AI-driven safety functions in industrial robots, CNC machines, autonomous vehicles on factory floors
  • Medical devices: MDR (EU) 2017/745 and IVDR (EU) 2017/746. AI Act interaction: AI-based diagnostic systems, treatment planning software, patient monitoring algorithms
  • Automotive: Vehicle Safety Regulation (EU) 2019/2144. AI Act interaction: ADAS components, autonomous driving functions, driver monitoring systems
  • Civil aviation: Regulation (EU) 2018/1139. AI Act interaction: AI in air traffic management, flight control systems, drone automation
  • Radio equipment: Radio Equipment Directive 2014/53/EU. AI Act interaction: AI-powered smart home devices, IoT systems with embedded AI

Single Assessment Principle

Article 43(3) provides that where a high-risk AI system is a product or safety component of a product covered by Annex I legislation, the conformity assessment under that sectoral legislation shall also cover the AI Act requirements — provided the sectoral assessment addresses equivalent requirements. This avoids duplicating assessments while ensuring all obligations are met.

6. Withdrawal and Recall Procedures

The CE mark is not permanent. Market surveillance authorities in any EU member state can challenge it, and providers themselves have obligations to act when non-conformity is discovered. The EU AI Act establishes clear procedures for both voluntary and mandatory withdrawal.

Provider's Corrective Action Obligations

Under Article 20, when a provider determines that their high-risk AI system is not in conformity with the regulation, they must immediately take corrective measures to bring the system into conformity, withdraw it, or recall it, as appropriate. The provider must also inform:

  • Distributors, deployers, authorized representatives, and importers of the non-conformity
  • The market surveillance authority of the member state(s) where the system was made available
  • The notified body that issued a certificate (if applicable) — the certificate may need to be suspended or withdrawn

Market Surveillance Authority Actions

Under Articles 79-82, market surveillance authorities can take escalating enforcement actions:

  • Request corrective action — order the provider to bring the system into compliance within a specified timeframe
  • Restrict or prohibit — prevent the system from being made available on the market
  • Order withdrawal — require removal of the system from the supply chain
  • Order recall — require the system to be returned from deployers who already have it

Serious Incident Reporting

If a high-risk AI system causes or contributes to a serious incident (death, serious harm to health, serious disruption of critical infrastructure, or serious violation of fundamental rights), the provider must report it to the market surveillance authority no later than 15 days after becoming aware of the incident. Shorter deadlines apply in the most severe cases: no later than 10 days in the event of a death, and no later than two days for a widespread infringement or serious disruption of critical infrastructure. Failure to report can result in separate penalties on top of any conformity-related fines.
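The 15-day window can be sketched as a simple deadline calculation. This example assumes calendar days counted from the date of awareness; confirm the exact counting rules against Article 73 and national guidance before relying on any such computation.

```python
# Sketch of the serious-incident reporting deadline.
# Assumes calendar days from the awareness date -- verify against
# Article 73, which also sets shorter windows for the gravest cases.
from datetime import date, timedelta

def reporting_deadline(awareness_date: date, window_days: int = 15) -> date:
    """Latest date by which the market surveillance authority
    must be notified."""
    return awareness_date + timedelta(days=window_days)

print(reporting_deadline(date(2026, 3, 1)))      # 2026-03-16
print(reporting_deadline(date(2026, 3, 1), 2))   # 2026-03-03
```

In practice a compliance team would wire this into incident tooling so the clock starts automatically when an incident ticket is classified as serious.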

7. Mutual Recognition Across Member States

One of the core principles of CE marking — and a fundamental pillar of the EU single market — is mutual recognition. The EU AI Act, as a directly applicable regulation (not a directive requiring national transposition), establishes uniform rules across all 27 member states plus the three EEA countries.

What Mutual Recognition Means in Practice

  • No additional national requirements. Once a high-risk AI system carries a valid CE mark, no member state can impose additional conformity requirements that duplicate or contradict the EU AI Act. A system lawfully placed on the market in France cannot be blocked in Germany on grounds already covered by the regulation.
  • Notified body certificates are valid EU-wide. A conformity certificate issued by a notified body designated in any member state is recognized across the entire EU/EEA. Providers do not need separate certificates for each country.
  • EU database registration is centralized. The Article 71 database is EU-wide, so a single registration covers market placement across all member states.
  • Market surveillance cooperation. Member states must cooperate through the EU AI Board and the market surveillance framework to ensure consistent enforcement. A finding of non-conformity in one member state can trigger EU-wide action.

Practical Implications for Cross-Border Providers

For AI providers operating across multiple EU markets, the mutual recognition principle significantly simplifies compliance. The same conformity assessment, the same Declaration of Conformity, and the same CE mark serve as the single passport to 30 national markets comprising over 450 million people.

However, mutual recognition does not prevent individual member states from designating their own market surveillance authorities, establishing their own AI regulatory sandboxes, or interpreting enforcement provisions within their national context. Providers should maintain relationships with national competent authorities in their primary markets.

Non-EU Providers

Providers established outside the EU must appoint an authorized representative within the EU before placing their high-risk AI system on the market (Article 22). The authorized representative acts as the regulatory point of contact and bears certain obligations — including maintaining the Declaration of Conformity and technical documentation. The CE mark remains valid EU-wide regardless of the provider's place of establishment.

8. Frequently Asked Questions

Is CE marking mandatory for all AI systems under the EU AI Act?
No. CE marking is only mandatory for high-risk AI systems as defined in Annex III (standalone high-risk use cases) and under Annex I (AI as a safety component of regulated products). Minimal-risk, limited-risk, and general-purpose AI systems do not require CE marking.

Can I self-certify my AI system for CE marking?
Most high-risk AI systems listed in Annex III can use internal conformity control (self-assessment) under Annex VI. However, biometric identification and categorisation systems (Annex III, point 1) require third-party assessment by a notified body under Annex VII, unless the provider has applied harmonised standards covering all relevant requirements, in which case internal control remains available.

What happens if I place an AI system on the EU market without CE marking?
Placing a high-risk AI system on the market without proper CE marking constitutes non-compliance. Penalties can reach EUR 15 million or 3% of global annual turnover, whichever is higher. Market surveillance authorities can also order withdrawal or recall of the non-compliant system.

Does CE marking from one EU member state apply across all member states?
Yes. CE marking is recognized across all 27 EU member states plus the EEA countries. Once an AI system carries a valid CE mark, no member state may impose additional national conformity requirements or prohibit its market placement on grounds already covered by the EU AI Act.


Harish Kumar

Founder & CEO, Quantamix Solutions B.V.

18+ years in enterprise AI across ING, Rabobank (€400B+ AUM), Philips, Amazon Ring, Deutsche Bank, and Reserve Bank of India. FRM, PMP, GCP certified. Patent holder (EP26162901.8). Published researcher (SSRN 6359818). Building traceable, auditable AI for regulated industries.