Publication
Generative AI: A global guide to key IP considerations
Artificial intelligence (AI) raises many intellectual property (IP) issues.
Global | Publication | July 2024
On 12 July 2024, the Artificial Intelligence (AI) Regulation, referred to as the AI Act, was published in the Official Journal of the European Union. Its general application date is 2 August 2026, although the AI Act also provides for different application dates for certain provisions.
This new regulation aims to establish harmonised rules to ensure that AI systems in the EU respect fundamental rights and ensure a high level of protection of health and safety, while also fostering investment and innovation in the field of AI.
The AI Act directly affects businesses operating within the EU, whether they are providers (i.e. those developing the systems), users (referred to as “deployers”), importers, distributors, or manufacturers of AI systems. The legislation provides clear definitions for the various actors involved in AI systems or practices and holds them accountable for compliance with the new rules. This means that all stakeholders must ensure that their AI practices comply with the requirements set out in the AI Act.
The AI Act also applies extraterritorially to companies not established in the EU. Providers must comply when placing AI systems or general purpose AI models on the market or putting them into service in the EU, regardless of where they are established. Similarly, importers, distributors, and manufacturers serving the EU market are also caught. Providers and deployers are also caught where the output of their AI systems is used in the EU, regardless of where they are located.
The regulatory framework defines four levels of risk for AI systems: unacceptable risk, high risk, limited risk and minimal risk.
Meanwhile, providers of “general purpose AI models”, such as large language models, will need to meet requirements designed to allow providers and deployers incorporating those models into AI systems to better understand their capabilities and limitations, and to address other inherent issues such as potential infringements caused by their training. The latter point is addressed through obligations to put in place a policy to respect EU copyright law and to publish a summary of the content used to train the model. General purpose AI models posing “systemic risk” must comply with additional obligations, such as documenting and disclosing significant incidents and mitigating such systemic risks.
The penalties for non-compliance with the AI Act are significant, ranging from €7.5 million to €35 million, or from 1% to 7% of the company's global annual turnover, depending on the severity of the infringement. It is therefore crucial for companies to ensure that they fully understand the provisions of the AI Act and comply with its requirements to avoid such sanctions.
Companies must establish appropriate governance and monitoring measures to ensure that their AI systems adhere to the AI Act.
Companies must prepare and ensure that their AI practices comply with these new rules. To initiate compliance with the AI Act, companies should begin by compiling an inventory of their current AI systems and models. Organisations that do not yet have such an inventory should assess their current status to understand their potential exposure; even if they are not using AI today, it is highly likely that this will change in the coming years. Initial identification can start from an existing software/applications catalogue or, in its absence, from surveys of the various departments, in particular the IT and risk departments.
Once the inventory is established, organisations should assess each system against the AI Act's risk categories and put in place the corresponding governance and compliance measures.
© Norton Rose Fulbright LLP 2023