Conformity Assessment & CE Marking — Art. 43–48

Articles 43 to 48 of the EU AI Act establish the conformity assessment procedures that high-risk AI systems must undergo before they are placed on the market or put into service. Upon successful assessment, the provider draws up an EU declaration of conformity (Art. 47) and affixes the CE marking (Art. 48). This module manages conformity assessments and CE marking status, and includes a sub-register for general-purpose AI (GPAI) models (Chapter V).

Types of Conformity Assessment

The Act provides for three assessment procedures of varying rigour, depending on the nature and domain of the AI system:

  • Internal Assessment. Reference: Art. 43(2), Annex VI. When required: default for most Annex III high-risk systems. The provider conducts the assessment internally, without external involvement. It requires a quality management system (Art. 17), complete technical documentation (Art. 11/Annex IV), and systematic verification of compliance. This is the most common type, applying to the majority of high-risk AI systems, and must be thorough, documented, and auditable.
  • Third-Party Assessment. Reference: Art. 43(1), Annex VII. When required: certain high-risk systems in specific domains. An independent third-party body reviews the technical documentation, evaluates the QMS, and may conduct testing. It provides additional assurance for systems with particularly severe potential impact. The body issues a report documenting its findings.
  • Notified Body Assessment. Reference: Art. 43(1), Annex VII. When required: biometric identification systems (Annex III, point 1) used for law enforcement, migration, or border control, in particular where harmonised standards have not been fully applied. This is the most rigorous assessment, carried out by a Member State-designated notified body listed in NANDO. It involves a comprehensive evaluation, including testing, potential modifications, and unannounced surveillance audits, and results in a certificate of limited validity (no more than four years for Annex III systems under Art. 44), subject to periodic surveillance.
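
If you mirror this decision in your own compliance tooling, the choice of procedure can be condensed into a simple rule. The sketch below is illustrative only: the type and field names are hypothetical rather than part of this module, and it simplifies Art. 43(1)-(2) while omitting the Annex I product pathway of Art. 43(3).

```typescript
// Simplified, illustrative mapping from a high-risk system's Annex III
// classification to the conformity assessment procedure it needs.
// Field and type names are hypothetical, not the module's API.
type AssessmentType = "Internal Assessment" | "Third-Party Assessment" | "Notified Body Assessment";

interface HighRiskSystem {
  annexIIIPoint: number;               // 1 = biometrics, 2-8 = other Annex III areas
  harmonisedStandardsApplied: boolean; // fully applied harmonised standards / common specifications
}

function requiredAssessment(system: HighRiskSystem): AssessmentType {
  if (system.annexIIIPoint === 1 && !system.harmonisedStandardsApplied) {
    // Art. 43(1): biometric systems without fully applied harmonised standards
    // must follow the Annex VII procedure with a notified body.
    return "Notified Body Assessment";
  }
  // Art. 43(2): points 2-8 of Annex III follow internal control (Annex VI);
  // point 1 may also opt for internal control when harmonised standards apply.
  return "Internal Assessment";
}
```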

List View

Step 1 — Open the Conformity List

Navigate to EU AI Act → Conformity & CE Marking. Each row shows the AI system, type badge, status, valid-until date, CE marking badge, and notified body name. Assessments approaching expiration are highlighted amber; expired assessments are highlighted red.
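
As a rough model of the row highlighting, assuming the 90-day warning window described under the Valid Until field below, the colouring can be derived from the valid-until date alone. The function and type names are illustrative, not part of the product.

```typescript
// Illustrative expiry colouring for a list row, based on its valid-until date.
type ExpiryHighlight = "none" | "amber" | "red";

function expiryHighlight(validUntil: Date, today: Date = new Date()): ExpiryHighlight {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysLeft = Math.floor((validUntil.getTime() - today.getTime()) / msPerDay);
  if (daysLeft < 0) return "red";     // assessment or certificate has expired
  if (daysLeft <= 90) return "amber"; // expiration approaching (90-day window)
  return "none";
}
```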

Step 2 — Filter by Status

Filter by status: Not Started, In Progress, Completed, Expired, or Failed. Focus on In Progress to drive completion and Expired to initiate reassessment.

Step 3 — Filter by Type

Use the Type dropdown to filter by assessment type: Internal Assessment, Third-Party Assessment, or Notified Body Assessment. This helps when reviewing assessment coverage across your AI portfolio — for example, verifying that all biometric identification systems used in law enforcement have notified body assessments as required by Art. 43(1).
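
If you review coverage outside the UI, for example over an export, the same two filters reduce to a simple predicate. The record shape below is an assumption for illustration, not the module's schema.

```typescript
// Illustrative status/type filtering over exported assessment rows.
type AssessmentStatus = "Not Started" | "In Progress" | "Completed" | "Expired" | "Failed";
type AssessmentKind = "Internal Assessment" | "Third-Party Assessment" | "Notified Body Assessment";

interface AssessmentRow {
  aiSystem: string;
  type: AssessmentKind;
  status: AssessmentStatus;
}

function filterAssessments(
  rows: AssessmentRow[],
  status?: AssessmentStatus,
  type?: AssessmentKind
): AssessmentRow[] {
  return rows.filter(
    (row) => (!status || row.status === status) && (!type || row.type === type)
  );
}

// Example: all in-progress notified body assessments that still need completion.
// const open = filterAssessments(rows, "In Progress", "Notified Body Assessment");
```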

Creating a New Conformity Assessment

Click + Add Assessment to open the creation form and complete the fields below; only the AI System field is strictly required:

  • AI System (dropdown, required): the AI system undergoing assessment. An AI system may have multiple assessments (initial, post-modification, periodic); all are retained for the audit trail.
  • Type (dropdown, optional): the assessment type. Selecting Third-Party or Notified Body reveals additional fields for the external body's details (name, ID, declaration reference).
  • Valid Until (date picker, optional): the expiration date of the assessment or certificate. For internal assessments, set it per your own policy (an annual cycle is recommended); for notified body assessments, use the certificate's validity period (no more than four years for Annex III systems). The dashboard tracks approaching expiration from 90 days before this date.
  • Notes (textarea, optional): assessment scope, methodology, findings, non-conformities and their resolutions, and references to supporting evidence. Maximum 5,000 characters.
  • Notified Body Name (text input, conditional): the official name of the notified or third-party body (e.g., "TUV SUD AG", "Bureau Veritas S.A."). Shown for Third-Party and Notified Body types; the body must be listed in the NANDO system.
  • Notified Body ID (text input, conditional): the four-digit NANDO identification number (e.g., "0123"), which appears in the EU declaration of conformity. Shown for Third-Party and Notified Body assessments.
  • Declaration Reference (text input, optional): the reference number of the declaration of conformity (Art. 47) or of the notified body certificate. For internal assessments, use your own numbering scheme (e.g., "DoC-AI-2026-001"). Required for Art. 49 EU database registration.
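
Taken together, the form corresponds roughly to the record shape below. The interface and pre-save check are a sketch under assumed names; they only encode the required and conditional rules from the list above.

```typescript
// Hypothetical assessment record and a pre-save check reflecting the field rules above.
type AssessmentKind = "Internal Assessment" | "Third-Party Assessment" | "Notified Body Assessment";

interface ConformityAssessment {
  aiSystemId: string;            // required
  type?: AssessmentKind;
  validUntil?: Date;
  notes?: string;                // up to 5,000 characters
  notifiedBodyName?: string;     // conditional: Third-Party / Notified Body
  notifiedBodyId?: string;       // four-digit NANDO number, e.g. "0123"
  declarationReference?: string; // e.g. "DoC-AI-2026-001"
}

function validateAssessment(a: ConformityAssessment): string[] {
  const errors: string[] = [];
  if (!a.aiSystemId) errors.push("AI System is required.");
  if (a.notes && a.notes.length > 5000) errors.push("Notes exceed 5,000 characters.");
  const external =
    a.type === "Third-Party Assessment" || a.type === "Notified Body Assessment";
  if (external && !a.notifiedBodyName) {
    errors.push("Notified body name is required for external assessments.");
  }
  if (external && a.notifiedBodyId && !/^\d{4}$/.test(a.notifiedBodyId)) {
    errors.push("Notified body ID should be a four-digit NANDO number.");
  }
  return errors;
}
```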

CE Marking — Art. 48

The CE marking is a visual and legal indicator that a high-risk AI system conforms with the requirements of the EU AI Act. It must be affixed to the AI system or, where that is not possible, to its packaging or accompanying documentation, in a visible, legible, and indelible manner before the system is placed on the market or put into service.

CE Marking Requirements: The marking must be (a) affixed visibly, legibly, and indelibly; (b) accompanied by the notified body's identification number where applicable; (c) applied before market placement; and (d) accompanied by the EU declaration of conformity. It indicates that the system has undergone conformity assessment and meets the applicable regulatory requirements.

CE Marking Badge

Each conformity assessment record displays a CE marking badge indicating the current marking status:

  • Applied (Green Badge) — The CE marking has been affixed to the AI system and/or its accompanying documentation. This indicates that the conformity assessment has been successfully completed, no unresolved non-conformities remain, and the provider has drawn up the EU declaration of conformity. Record the date of marking application in the notes field for traceability.
  • Not Applied (Grey Badge) — The CE marking has not yet been applied. This is the default status for new assessments and for assessments that are still in progress or have failed. CE marking should only be applied after the conformity assessment is successfully completed and all non-conformities are resolved.
Warning — Premature CE Marking: Never apply CE marking before assessment completion. Violation of Art. 48 can result in market withdrawal orders, fines up to 15M EUR or 3% of global turnover, and reputational damage.
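
If you automate the badge transition, the conditions above reduce to a simple guard. The state shape below is assumed for illustration and is not the module's schema.

```typescript
// Illustrative guard: CE marking may only be set to "Applied" once the
// assessment is completed, non-conformities are resolved, and the EU
// declaration of conformity (Art. 47) exists.
interface AssessmentState {
  status: "Not Started" | "In Progress" | "Completed" | "Expired" | "Failed";
  openNonConformities: number;
  declarationReference?: string;
}

function canApplyCeMarking(state: AssessmentState): boolean {
  return (
    state.status === "Completed" &&
    state.openNonConformities === 0 &&
    Boolean(state.declarationReference)
  );
}
```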

GPAI Models — General-Purpose AI Register

Chapter V of the EU AI Act introduces specific obligations for providers of General-Purpose AI (GPAI) models: AI models trained on large amounts of data, typically using self-supervision at scale, that display significant generality and can competently perform a wide range of distinct tasks. The module includes a dedicated sub-register for tracking the GPAI models that your organisation provides, deploys, or integrates.

GPAI Model Fields

  • Name (text input, required): the name of the GPAI model (e.g., "GPT-4o", "Claude Opus 4", "Gemini Ultra", "Llama 3.1 405B", or your organisation's custom fine-tuned model name). Maximum 300 characters. The name should match the name used in external documentation, provider communications, and API references so the model is clearly identifiable. If you use multiple GPAI models, maintain a separate record for each.
  • Provider (text input, optional): the name of the GPAI model provider (e.g., "OpenAI", "Anthropic", "Google DeepMind", "Meta", "Mistral AI"). If your organisation developed the model, enter your organisation's name. This field distinguishes models you provide (carrying the full GPAI provider obligations) from models you integrate from third parties (where your obligations centre on obtaining adequate documentation from the provider). Tracking providers supports supply chain governance.
  • Model Type (dropdown, optional): the GPAI model classification under Chapter V:
      • General Purpose: a standard GPAI model. Baseline obligations: maintain technical documentation, provide information to downstream providers, comply with EU copyright law, and publish a training data summary.
      • Systemic Risk: a high-impact GPAI model (Art. 51), classified on the basis of training compute (at or above 10^25 FLOPs), benchmark results, or Commission designation. Additional obligations: model evaluations with adversarial testing, systemic risk assessment and mitigation, serious incident reporting, cybersecurity protection, and energy consumption reporting.
      • Open Source: a model whose parameters and weights are publicly available. The documentation obligations are relaxed, but the copyright policy and training data summary obligations still apply, and the exception does not apply at all if the model poses systemic risk.
  • Status (dropdown, optional): the current status of the GPAI model within your organisation: Active (currently in use or being provided), Deprecated (being phased out, replacement identified), Retired (no longer in use or provided), or Evaluation (being assessed for potential adoption or integration). Track status changes through the audit log to maintain a complete history of GPAI model usage.
  • Version (text input, optional): the specific version of the GPAI model (e.g., "4o-2024-08-06", "3.5-sonnet-20241022", "1.5-pro-002"). Version tracking is particularly important for GPAI models because different versions may have significantly different capabilities, risk profiles, and compliance characteristics. When a provider releases a new version, assess whether the changes constitute a substantial modification requiring updated documentation or a new risk assessment, and create a new record for each major version change to maintain a clear version history.
Tip — GPAI Integration Mapping: If your AI systems use GPAI models as components (e.g., a customer service application built on a commercial LLM, or a document analysis tool using an embedding model), register each GPAI model in this sub-register and reference it in the corresponding AI system's technical documentation. This creates a clear mapping between your AI systems and the underlying GPAI models, demonstrating awareness of supply chain obligations. When a GPAI provider updates their model or terms of service, you can quickly identify which of your AI systems are affected by searching the GPAI register.
Provider vs. Deployer Obligations: As a GPAI provider, you bear the full set of Chapter V obligations. As a downstream provider or deployer, verify that you have adequate documentation from the GPAI provider, maintain integration records, comply with the obligations arising from your AI system's own risk classification, and include the GPAI model details in your Annex IV documentation.
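
If you maintain the GPAI register programmatically, the entries and the compute-based systemic risk presumption of Art. 51 can be modelled roughly as follows. All names are illustrative, and training compute is only one of the classification routes (benchmark results and Commission designation are the others), so the check expresses a presumption rather than a determination.

```typescript
// Illustrative GPAI register entry and helpers.
type GpaiModelType = "General Purpose" | "Systemic Risk" | "Open Source";
type GpaiStatus = "Active" | "Deprecated" | "Retired" | "Evaluation";

interface GpaiModel {
  name: string;                  // required, max 300 characters
  provider?: string;
  modelType?: GpaiModelType;
  status?: GpaiStatus;
  version?: string;
  trainingComputeFlops?: number; // if disclosed or estimated by the provider
  linkedAiSystemIds?: string[];  // AI systems built on this model
}

// Art. 51: training compute of at least 10^25 FLOPs triggers a presumption of systemic risk.
const SYSTEMIC_RISK_FLOPS = 1e25;

function presumedSystemicRisk(model: GpaiModel): boolean {
  return (model.trainingComputeFlops ?? 0) >= SYSTEMIC_RISK_FLOPS;
}

// Example: when a provider updates a model, list the AI systems that may be affected.
function affectedSystems(register: GpaiModel[], modelName: string): string[] {
  return register
    .filter((m) => m.name === modelName)
    .flatMap((m) => m.linkedAiSystemIds ?? []);
}
```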

Assessment Lifecycle

Step 1 — Initiation

Create the assessment record. Select the type based on the system's risk classification and domain. Engage a notified body or third-party assessor if required.

Step 2 — Documentation Preparation

Gather required documentation: technical docs (Art. 11/Annex IV), QMS (Art. 17), risk management (Art. 9), data governance (Art. 10), human oversight (Art. 14), and testing reports (Art. 15).
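
A lightweight way to track readiness for Step 3 is a checklist keyed to the evidence categories above. The structure below is only a sketch with assumed names.

```typescript
// Illustrative pre-assessment checklist for the documentation package.
const REQUIRED_EVIDENCE = [
  "Technical documentation (Art. 11 / Annex IV)",
  "Quality management system (Art. 17)",
  "Risk management file (Art. 9)",
  "Data governance records (Art. 10)",
  "Human oversight measures (Art. 14)",
  "Testing reports: accuracy, robustness, cybersecurity (Art. 15)",
] as const;

function missingEvidence(attachedDocuments: Set<string>): string[] {
  // Returns the evidence categories that still have no attached document.
  return REQUIRED_EVIDENCE.filter((item) => !attachedDocuments.has(item));
}

// Example:
// missingEvidence(new Set(["Quality management system (Art. 17)"]))
//   -> the five remaining categories
```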

Step 3 — Assessment Execution

Conduct assessment per Annex VI (internal) or Annex VII (third-party/notified body). Document all findings with evidence references.

Step 4 — Non-Conformity Resolution

Address non-conformities with corrective actions. Retest and verify effectiveness. For notified body assessments, re-evaluation may be required.

Step 5 — Declaration & CE Marking

Draw up the EU declaration of conformity (Art. 47), affix CE marking (Art. 48), update record to "Completed" with CE Marking "Applied", and register in the EU database (Art. 49).

Step 6 — Post-Assessment Surveillance

Maintain compliance via post-market monitoring (Art. 72). Cooperate with surveillance audits. Schedule reassessment before expiry. Monitor for substantial modifications triggering reassessment (Art. 43(4)).
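
To avoid gaps in certificate coverage, a reassessment start date can be derived from the expiry date. The 90-day lead time in this sketch matches the dashboard's warning window but is otherwise just an example.

```typescript
// Illustrative reminder: start reassessment before the current assessment expires.
function reassessmentStartDate(validUntil: Date, leadTimeDays = 90): Date {
  const msPerDay = 24 * 60 * 60 * 1000;
  return new Date(validUntil.getTime() - leadTimeDays * msPerDay);
}

// Example: a certificate expiring 2027-06-30 should trigger reassessment by 2027-04-01.
// reassessmentStartDate(new Date("2027-06-30"));
```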

Warning — Substantial Modification: Art. 43(4) requires a new conformity assessment after any substantial modification — a change affecting compliance or intended purpose. Common triggers: significant retraining, architecture changes, expanded purpose, new deployment contexts, or new GPAI model integration. Monitor proactively and reassess promptly.
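
If you log system changes, the common triggers above can drive an automated reassessment flag. This is a rough sketch with hypothetical change categories; whether a given change is a substantial modification under Art. 43(4) remains a case-by-case judgement.

```typescript
// Illustrative reassessment flag based on logged change categories.
type ChangeKind =
  | "significant_retraining"
  | "architecture_change"
  | "expanded_intended_purpose"
  | "new_deployment_context"
  | "new_gpai_model_integration"
  | "minor_bugfix";

const REASSESSMENT_TRIGGERS: ReadonlySet<ChangeKind> = new Set([
  "significant_retraining",
  "architecture_change",
  "expanded_intended_purpose",
  "new_deployment_context",
  "new_gpai_model_integration",
]);

function mayRequireReassessment(changes: ChangeKind[]): boolean {
  return changes.some((change) => REASSESSMENT_TRIGGERS.has(change));
}
```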