Recent advancements in artificial intelligence (AI) are enabling organisations to enhance their efficiency by automating tasks and analysing large datasets. However, these advancements come with inherent risks.
With the recent entry into force of the Artificial Intelligence Act (hereafter "AI Act"), businesses must now comply with strict new requirements.
Documentation Requirements
Like the GDPR, the AI Act introduces transparency measures and documentation obligations. Adopting a risk-based approach, the AI Act requires companies providing AI systems to prepare detailed technical documentation for each high-risk AI system or general-purpose AI model. This ensures transparency and traceability for both deployers and users of AI.
This documentation must include information on the design, development, testing, and post-market monitoring of systems. Companies must maintain detailed records on the use and performance of AI systems and make this information available to competent authorities upon request. Additionally, companies must demonstrate compliance with ethical principles and fundamental rights, such as non-discrimination and data protection.
Companies are required to implement risk management procedures to identify, assess, and mitigate risks associated with AI systems. This includes regular assessments and updates of technical documentation based on new information or system changes. These procedures must be documented, along with the risk analyses conducted, to justify the company’s position in the event of an incident.
Challenges and Opportunities
Compliance with the AI Act presents significant challenges for many businesses, particularly SMEs and startups. However, it also offers opportunities to boost consumer confidence and enhance competitiveness in the European market. By adopting rigorous documentation practices, companies can avoid sanctions and position themselves as leaders in the responsible use of AI.
Our advice:
The AI Act complements the requirements of the GDPR rather than replacing them. Managing documentation requirements is crucial to navigating the AI regulatory landscape in Europe. Businesses need to establish robust documentation processes to meet the new requirements and capitalise on the opportunities the AI Act presents.
If you have any questions about the AI Act’s documentation requirements in the context of your contracts or AI tools, or if you need support in defining your company’s strategy in this area, the Lexing team is here to help.
Register for our earlegal training course on 13 December 2024 “Framing your AI projects: contractual issues”! We will address the following questions:
- What are the main contractual aspects to consider at the AI development stage?
- What contractual aspects should be taken into account at the AI marketing stage?
- How can you ensure proper management of AI-related incidents across the contractual chain?
- How can you manage your compliance with the AI Act contractually?