
How to Stay Compliant with the EU AI Act

Artificial intelligence is transforming industries at an unprecedented pace, offering innovative solutions across healthcare, finance, education, and beyond. However, these advancements also bring risks, from unintentional bias and privacy breaches to systemic harm. The EU AI Act provides the first comprehensive regulatory framework worldwide, ensuring that AI technologies are developed and deployed responsibly. This blog provides a summary of the EU AI Act, its implementation timeline, and the specific requirements for high-risk AI systems (HRAIS), general-purpose AI models, and foundation models.

Why Do We Need Laws on AI?

As AI technologies become increasingly powerful and widespread, they bring both tremendous opportunities and significant risks. Without clear regulations, AI systems could be misused, cause unintentional harm, or operate in ways opaque to users and regulators.

The EU AI Act establishes a clear, risk-based framework to protect individuals, businesses, and society from threats such as bias, misinformation, and privacy breaches. It also builds trust and accountability, ensuring AI developers and deployers follow ethical and legal standards.

Who Is Affected by the EU AI Act?

The EU AI Act applies to all actors in the AI ecosystem: providers, deployers, importers, distributors, and product manufacturers. In practice, anyone developing, using, importing, distributing, or manufacturing AI systems in the EU falls within its scope.

Importantly, the EU AI Act also has extraterritorial reach: providers and deployers located outside the EU must comply if their AI systems are intended for use within the EU.

The regulation defines “AI systems” broadly, covering machine learning, statistical approaches, and symbolic reasoning. This ensures that both advanced generative models and traditional rule-based AI are included.

Timeline for Implementation

The EU AI Act entered into force on 1 August 2024 and becomes fully applicable on 2 August 2026. A phased rollout applies:

- 2 February 2025: Prohibitions on certain AI practices, plus obligations related to AI literacy, come into force.
- 2 August 2025: The governance framework and obligations for general-purpose AI (GPAI) models apply.
- 2 August 2027: Providers of high-risk AI systems embedded into regulated products benefit from a longer transition period to comply.

Understanding the Risk-Based Framework

The EU AI Act categorizes AI systems into four levels of risk:

- Unacceptable risk: certain practices are strictly prohibited, such as manipulative or deceptive AI, social scoring, untargeted facial recognition scraping, or emotion recognition in workplaces and schools.
- High risk: systems with serious implications for safety or fundamental rights, such as AI in medical devices, recruitment, education, law enforcement, border control, or judicial decision support. These are subject to the strictest obligations.
- Limited risk: systems like chatbots or generative AI tools must meet transparency requirements, ensuring users know when they interact with AI.
- Minimal or no risk: most everyday applications, like video games or spam filters, which are exempt from regulatory requirements.
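To make these four tiers more concrete, here is a minimal triage sketch in Python. It is only an illustration of the logic described above, not the Act's own classification procedure: the practice and domain lists are simplified assumptions, and a real assessment must follow the Act's prohibited-practices provisions and its high-risk annex.

```python
# Illustrative only: a rough triage helper for the four risk tiers described above.
# The practice and domain lists are simplified assumptions, not legal definitions.

PROHIBITED_PRACTICES = {
    "social scoring",
    "untargeted facial recognition scraping",
    "emotion recognition at work or school",
    "manipulative or deceptive techniques",
}

HIGH_RISK_DOMAINS = {
    "medical device",
    "recruitment",
    "education",
    "law enforcement",
    "border control",
    "judicial decision support",
}

def triage_risk_tier(practice: str, domain: str, interacts_with_humans: bool) -> str:
    """Return a first-pass risk tier for an AI use case (sketch, not legal advice)."""
    if practice in PROHIBITED_PRACTICES:
        return "unacceptable risk (prohibited)"
    if domain in HIGH_RISK_DOMAINS:
        return "high risk (full HRAIS obligations)"
    if interacts_with_humans:
        return "limited risk (transparency obligations)"
    return "minimal or no risk"

# Example: a CV-screening tool used in hiring.
print(triage_risk_tier(practice="none", domain="recruitment", interacts_with_humans=True))
# -> high risk (full HRAIS obligations)
```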
Compliance Requirements for High-Risk AI Systems

Providers of high-risk AI systems (HRAIS) must implement safeguards throughout the system’s lifecycle. These include:

- Establishing a risk management system to identify and mitigate risks.
- Ensuring strong data governance and high-quality training datasets.
- Preparing detailed documentation and logging mechanisms.
- Providing transparency about the system’s capabilities and limitations.
- Guaranteeing human oversight, so operators can supervise and intervene if needed.
- Ensuring accuracy, robustness, and cybersecurity against errors and threats.
- Setting up a quality management system for internal compliance.
- Conducting post-market monitoring and reporting serious incidents within 15 days.

Before deployment, providers must complete a conformity assessment, affix CE marking, and register their system in the EU’s central database.

Deployers (users) also face obligations: in some cases, they must conduct a fundamental rights impact assessment (FRIA), follow the provider’s instructions, monitor operation, and keep system logs.

Bringing a High-Risk AI System to Market

The compliance process follows four main steps:

1. Develop the system – design with risk and compliance in mind.
2. Conformity assessment – verify compliance, sometimes with the involvement of a notified body.
3. Registration – record the system in the EU database.
4. Declaration and CE marking – sign a declaration of conformity before market launch.

Any substantial modification to the AI system requires reassessment.

General Purpose AI and Foundation Models

General-purpose AI (GPAI) models like ChatGPT, Gemini, or DALL·E face obligations mainly around transparency: preparing technical documentation, ensuring compliance with copyright law, and providing summaries of training data.

The Act also introduces rules for foundation models trained on large datasets. Some of these are designated as systemic-risk foundation models, given their potential impact across multiple sectors. They face enhanced obligations, including rigorous model testing (such as red-teaming), systemic risk assessments, detailed regulatory reporting, strong cybersecurity, and even monitoring of energy efficiency.

Penalties for Non-Compliance

Non-compliance with the EU AI Act carries heavy sanctions: fines of up to €35 million or 7% of global annual turnover, depending on the type and severity of the violation. This makes compliance not just a legal necessity but a business-critical priority.

Beyond the AI Act: Interplay with Other EU Laws

The EU AI Act does not exist in isolation. It interacts with other key frameworks, including the GDPR (data protection), the Cyber Resilience Act (security), and the Product Safety and Machinery Regulations (for AI embedded in physical goods). You can read more about how to stay compliant with the GDPR here. A successful compliance strategy must therefore be holistic, covering all these areas.

The EU AI Act is a landmark regulation: the first of its kind to establish a risk-based, trust-driven approach to AI. It balances innovation with protection, ensuring that harmful practices are banned, high-risk applications are tightly regulated, and transparency is guaranteed for general-purpose AI.

For businesses, compliance is not optional. Those that act early will not only avoid penalties but also position themselves as trusted leaders in responsible AI.
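As a back-of-the-envelope illustration of the penalty ceiling mentioned above, the sketch below estimates the maximum exposure for a given global turnover. It assumes the higher of the two amounts applies, which the Act reserves for the most serious categories of violation; actual fines depend on the type and severity of the breach.

```python
# Rough illustration of the EU AI Act's upper penalty bracket:
# up to EUR 35 million or 7% of global annual turnover.
# Assumes the higher of the two applies (the bracket for the most serious violations).

def max_ai_act_fine(global_annual_turnover_eur: float) -> float:
    """Return the theoretical maximum fine in EUR under the top penalty bracket."""
    return max(35_000_000, 0.07 * global_annual_turnover_eur)

# Example: a company with EUR 2 billion in global annual turnover.
print(f"{max_ai_act_fine(2_000_000_000):,.0f} EUR")  # -> 140,000,000 EUR
```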

Compliance with GDPR: A Complete Guide for International Companies

The General Data Protection Regulation (GDPR) is one of the most comprehensive privacy laws in the world, setting strict standards for how personal data is collected, stored, and processed. For international companies, understanding what GDPR stands for, its key requirements, and the consequences of non-compliance is essential. This guide explains the meaning of GDPR, its global application, and the practical steps businesses can take to maintain compliance with GDPR while protecting customer trust.

What is GDPR? (GDPR Meaning)

The GDPR is a comprehensive EU data protection law that sets strict guidelines on how organizations collect, process, store, and share personal data. The GDPR is intended to give individuals greater control over their personal information while ensuring that businesses are transparent and accountable in their data practices.

What does GDPR stand for? The full form of GDPR is General Data Protection Regulation.

Does GDPR apply worldwide? Although it is an EU regulation, the GDPR applies both to EU-based companies and to international companies outside the EU if they handle the personal data of EU residents. By setting clear rules, the regulation aims to strengthen data protection and safeguard privacy rights in today’s digital economy.

What does it mean to be GDPR Compliant?

Compliance with GDPR refers to an organization’s ability to meet all the requirements set out under the EU GDPR for collecting, storing, and processing personal data. Being GDPR compliant means ensuring transparency, security, and accountability at every stage of data handling. This includes obtaining valid user consent where necessary, implementing strong security measures, and responding promptly to data subject requests. Achieving GDPR compliance is not a one-time task: it requires continuous legal, technical, and organizational efforts to maintain GDPR data protection standards and adapt to evolving privacy risks.

Who Needs to Ensure Compliance with GDPR?

The EU GDPR applies to any organization, regardless of location, that collects or processes the personal data of individuals in the European Union. Its extraterritorial scope means that even businesses outside the EU must follow GDPR compliance rules if they offer goods or services to EU residents or monitor their online activities.

Under the regulation, there are three key roles:

- Data Controllers – Organizations or individuals who determine the purpose and method of processing personal data. They hold primary responsibility for ensuring full compliance with GDPR in all data-handling activities.
- Data Processors – Third parties that process personal data on behalf of a controller. They are also required to be GDPR compliant and must implement robust technical and organizational safeguards.
- Data Subjects – Individuals whose personal information is collected and processed. The GDPR is designed to protect their rights, including access, correction, deletion, and objection to data use.

Importantly, GDPR compliance obligations apply regardless of company size. What matters is not where your business is based, but how you collect, store, and manage personal data.
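To make the scope rules above more tangible, here is a minimal applicability sketch. The data model and field names are illustrative assumptions, and whether the GDPR actually applies in a specific situation should always be confirmed with legal counsel.

```python
# Illustrative sketch: does the GDPR plausibly apply to this organization?
# Field names are assumptions for the example, not legal terms of art.

from dataclasses import dataclass

@dataclass
class Organization:
    established_in_eu: bool               # has an establishment in the EU
    offers_goods_or_services_to_eu: bool  # targets EU residents
    monitors_eu_individuals: bool         # e.g., tracking or profiling EU residents
    processes_personal_data: bool

def gdpr_likely_applies(org: Organization) -> bool:
    """First-pass check mirroring the scope rules described above (not legal advice)."""
    if not org.processes_personal_data:
        return False
    return (
        org.established_in_eu
        or org.offers_goods_or_services_to_eu
        or org.monitors_eu_individuals
    )

# Example: a US retailer shipping to EU customers and tracking their browsing.
shop = Organization(False, True, True, True)
print(gdpr_likely_applies(shop))  # -> True
```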
Key GDPR Compliance Requirements

The EU GDPR sets strict rules to protect individual privacy and promote transparency in how personal data is handled. Failing to meet these GDPR compliance obligations can lead to substantial fines and serious reputational harm. Below are the core requirements businesses, both in and outside the EU, must follow to remain GDPR compliant.

7 Data Protection Principles

The regulation is based on seven key principles that guide responsible data management:

- Lawfulness, fairness, and transparency – Personal data must be collected legally, and individuals must be informed about its use.
- Purpose limitation – Data can only be used for clearly defined, legitimate purposes.
- Data minimization – Only collect the data necessary to fulfil the stated purpose.
- Accuracy – Keep personal data up to date and correct inaccuracies promptly.
- Storage limitation – Retain data only for as long as needed.
- Integrity and confidentiality – Use strong security measures to prevent breaches or unauthorized access.
- Accountability – Be able to demonstrate compliance with GDPR at all times.

Legal Bases for Processing

Before processing personal data, businesses must establish a lawful basis, such as:

- Consent – Individuals clearly agree to the processing of their data.
- Contractual necessity – Data is required to fulfil a contract.
- Legal obligation – Compliance with laws such as tax or employment regulations.
- Vital interests – Protecting someone’s life in urgent situations.
- Public interest – Performing official or governmental tasks.
- Legitimate interests – Activities like fraud prevention or security, provided they don’t override individual rights.

Data Subject Rights

GDPR grants individuals significant control over their personal data, including the right to:

- Access their information
- Correct inaccuracies
- Request deletion (“right to be forgotten”)
- Restrict or object to processing
- Transfer data to another provider

Organizations must have processes in place to respond promptly to these requests.

Documentation and Accountability

Businesses must maintain detailed records of:

- Data types processed
- Processing purposes
- Data storage locations
- Retention periods
- Security measures in place

Clear documentation supports transparency and helps demonstrate GDPR compliance during audits or investigations.

Data Protection Officer (DPO) Requirement

Organizations that process large-scale or sensitive personal data may be required to appoint a Data Protection Officer. The DPO oversees compliance efforts, advises on policies, and serves as a contact point for both regulators and data subjects. Even when not mandatory, having a DPO can greatly improve data protection governance.

How to Stay Compliant with GDPR

Staying GDPR compliant requires more than a one-time checklist: it is an ongoing process of monitoring, updating, and improving your data protection practices. Below are key steps to help organizations meet EU GDPR requirements and maintain strong GDPR data protection standards.

Understand the GDPR Principles

Familiarize yourself with the seven core principles of the General Data Protection Regulation: lawfulness, fairness, and transparency; purpose limitation; data minimization; accuracy; storage limitation; integrity and confidentiality; and accountability. Aligning business processes with these principles is the foundation of compliance with GDPR.

Conduct a Data Audit

Identify all personal data you collect, process, and store, whether it belongs to customers, employees, or third parties. Record where it is stored, how long it is retained, and who has access to it. This will help you manage risks and identify gaps in compliance.

Document and Record All

The CSRD Explained: A Practical Guide for International Businesses in the EU

The CSRD is reshaping the way businesses report on sustainability and ESG performance in Europe. For international companies operating within or doing significant business in the EU, understanding the CSRD is critical. With mandatory disclosure requirements, greater transparency standards, and harmonized rules, the CSRD aims to create a level playing field and ensure that sustainability reporting holds the same weight as financial reporting. This guide breaks down what the CSRD (the EU’s guidelines on reporting climate-related information) is, who it affects, what needs to be reported, and when. We also cover key definitions and thresholds, so international businesses can stay ahead of compliance.

What is CSRD? (CSRD Meaning)

The Corporate Sustainability Reporting Directive is a European Union directive that expands and replaces the Non-Financial Reporting Directive (NFRD). It mandates detailed sustainability reporting by companies operating in the EU, including some non-EU firms. It is essentially the European climate reporting standard.

The CSRD was adopted in December 2022 as part of the European Green Deal and the broader EU Sustainable Finance Agenda. Its main objective is to improve the quality, consistency, and comparability of sustainability reporting across companies and industries. This new framework requires companies to publish an annual CSRD report as part of their management report, following the European Sustainability Reporting Standards (ESRS) developed by EFRAG.

Who Needs to Comply? (Reporting Thresholds, Pre-Omnibus)

The following companies need to publish sustainability reports under the CSRD:

- Large EU companies meeting at least 2 of the 3 criteria below:
  - Over 250 employees
  - Net turnover of more than €50 million
  - Total assets exceeding €25 million
- Listed SMEs, except micro-enterprises, with simplified standards and a transition period until 2028.
- Non-EU companies that:
  - Have a net turnover of €150 million or more in the EU, and
  - Own at least one EU-based subsidiary or branch that meets CSRD thresholds.
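As a quick illustration of the size test for large EU companies (meeting at least 2 of the 3 criteria above), here is a minimal sketch. The field names are assumptions made for the example, and the listed-SME and non-EU routes into scope are not modelled.

```python
# Illustrative sketch of the CSRD "large company" size test described above:
# an EU company is in scope if it meets at least 2 of the 3 criteria.
# Field names are illustrative; listed-SME and non-EU routes are not covered.

from dataclasses import dataclass

@dataclass
class EUCompany:
    employees: int
    net_turnover_eur: float
    total_assets_eur: float

def is_large_company_in_scope(c: EUCompany) -> bool:
    criteria_met = sum([
        c.employees > 250,
        c.net_turnover_eur > 50_000_000,
        c.total_assets_eur > 25_000_000,
    ])
    return criteria_met >= 2

# Example: 300 employees, EUR 40M turnover, EUR 30M total assets -> 2 of 3 criteria met.
print(is_large_company_in_scope(EUCompany(300, 40_000_000, 30_000_000)))  # -> True
```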
CSRD Timeline (Pre-Omnibus)

The CSRD timeline for publishing sustainability reports is as follows:

- Large companies already subject to the NFRD: reporting begins with FY 2024, with reports due in 2025.
- Large EU companies newly covered by the CSRD: reporting begins with FY 2025, with reports due in 2026.
- Listed SMEs: reporting begins with FY 2026, with reports due in 2027.
- Non-EU companies: reporting begins with FY 2028, with reports due in 2029.

Note: Voluntary adoption is encouraged before mandatory deadlines, especially to align with investor expectations and ESG-focused stakeholders.

What Is the CSRD Double Materiality Concept?

Double materiality is a core concept in the CSRD. It means that companies must assess and report on sustainability matters from two perspectives:

- Financial materiality – How environmental, social, and governance (ESG) issues impact the company’s financial position and performance.
- Impact materiality – How the company’s operations affect people and the environment.

Under the CSRD, both angles must be evaluated to provide a full picture of sustainability performance. This goes beyond traditional financial reporting and helps stakeholders better understand a company’s risks and impacts. This requirement applies to all in-scope companies and is central to the ESRS standards.

Does CSRD Require Scope 3 Emissions Reporting?

The CSRD requires disclosure of Scope 1, 2, and 3 greenhouse gas (GHG) emissions under the European Sustainability Reporting Standards (ESRS E1):

- Scope 1: Direct emissions from owned or controlled sources (e.g., company vehicles).
- Scope 2: Indirect emissions from purchased electricity, heat, or steam.
- Scope 3: All other indirect emissions in the value chain (e.g., supplier emissions, employee commuting, product use).

For many companies, Scope 3 accounts for the largest share of total emissions, and the CSRD emphasizes transparency here. While data availability challenges are recognized, companies are expected to progressively improve their disclosures and estimation methods.

What Are the ESRS (European Sustainability Reporting Standards)?

To ensure consistency and clarity in sustainability reporting, companies must report in line with the European Sustainability Reporting Standards (ESRS). Developed by EFRAG (the European Financial Reporting Advisory Group), these standards outline how and what companies must disclose regarding ESG (Environmental, Social, and Governance) issues.

The ESRS are split into:

- 2 General Standards – providing foundational reporting principles and company-wide disclosures.
- 10 Topical Standards – grouped under Environment, Social, and Governance, focused on material risks, impacts, and opportunities.

Companies are only required to report on data points that are material to them, based on their double materiality assessment. For example, a manufacturer may focus on ESRS E2 (Pollution), while a consumer-focused company may emphasize ESRS S4 (Consumers and End-Users).

Note: Climate-related disclosures (ESRS E1) are presumed material by default; companies must justify their assessment if they conclude otherwise.

The standards break down as follows:

General standards:
- ESRS 1 – General Principles: foundational sustainability principles and reporting concepts.
- ESRS 2 – General Disclosures: governance, strategy, impacts, risks and opportunities, measurement, and objectives.

Topical standards:
- Environment: ESRS E1 – Climate Change, ESRS E2 – Pollution, ESRS E3 – Water and Marine Resources, ESRS E4 – Biodiversity and Ecosystems, ESRS E5 – Resource Use and Circular Economy.
- Social: ESRS S1 – Own Workforce, ESRS S2 – Workers in the Value Chain, ESRS S3 – Affected Communities, ESRS S4 – Consumers and End-Users.
- Governance: ESRS G1 – Business Conduct.

Reporting, Publication, and Assurance

- Format: Reports must be embedded in the annual management report and published digitally in XHTML format on the company’s website, unifying financial and non-financial disclosures.
- Timeline: Annual publication, within 4 months of the financial year-end.
- Audit: Reports must be audited by an independent party, starting with limited assurance and evolving to reasonable assurance by October 2028.
- SMEs: Simplified sector-specific standards are being developed for listed SMEs, with a two-year opt-out option.

These standards aim not only to increase transparency, but also to embed sustainability into the heart of corporate strategy.

What is Gap Analysis?

Following the identification of the material issues for the company and their associated indicators, the next step is to outline the data collection process in preparation for the audit phases. This includes describing the data to be gathered: the scope covered, the calculation methodology, the level of detail, the validation methods, and so on. This initial selection and description of data enables the execution of a “gap analysis”, which involves identifying the gap between the information currently available and the reporting requirements. The process
