As explained above, artificial intelligence can execute tasks that demand independent decision-making as effectively as a human would perform the same task. An accident involving artificial intelligence is an incident in which the AI has played a role in the chain of events leading to the accident. As AI gradually enters physical products as well as administrative decision-making processes, concerns have emerged about the risks posed by its uncontrolled actions.
New EU regulations and liability regime for AI and emerging tech
During 2022, the European Commission will seek to establish regulations on artificial intelligence and emerging technologies to further clarify liability. This will happen on two tracks. Under the Coordinated Plan on AI, the Commission will propose measures adapting the liability framework to the challenges of new technologies, including AI.
In addition, the Commission will propose revisions, prepared by its Directorate-General for Internal Market, Industry, Entrepreneurship and SMEs (DG GROW), of the Product Liability Directive, which concerns liability for injuries to consumers and damage to consumer property caused by any kind of physical product. Of course, liability leads to compensation only after harm has occurred. From a risk management perspective, the Commission is therefore also proposing to revise product safety legislation, such as the Machinery Directive and the General Product Safety Directive, to take new technologies into account.
What kind of reforms can we expect for liability rules?
The Expert Group on Liability and New Technologies established by the Commission released a report, Liability for artificial intelligence and other emerging digital technologies1), in 2019, highlighting possible approaches to allocating liability. Currently, the producer or the importer (into the EEA) of a product is liable for injuries and physical damage caused by safety defects. With new technologies, however, liability could also extend to providers of software, networks or IoT-based connected systems, to users of the new technologies or, indeed, to producers of the AI technology used in those systems. The expert group suggested the following solutions:
- A person operating a permissible technology that nevertheless carries an increased risk of harm to others, for example AI-driven robots in public spaces, should be subject to strict liability for damage resulting from its operation.
- In situations where a service provider ensuring the necessary technical framework has a higher degree of control than the owner or user of an actual product or service equipped with AI, this should be considered in determining who primarily operates the technology.
- A person using a technology which has a certain degree of autonomy should not be less accountable for ensuing harm than if said harm had been caused by a human auxiliary.
- Manufacturers of products or digital content incorporating emerging digital technology should be liable for damage caused by defects in their products, even if the defect was caused by changes made to the product under the producer’s control after it had been placed on the market.
- For situations exposing third parties to an increased risk of harm, compulsory liability insurance could give victims better access to compensation and protect potential tortfeasors against the risk of liability.
- Where a particular technology increases the difficulties of proving the existence of an element of liability beyond what can be reasonably expected, victims should be entitled to facilitation of proof.
- Emerging digital technologies should come with logging features, where appropriate in the circumstances, and failure to log, or to provide reasonable access to logged data, should result in a reversal of the burden of proof so that the victim is not disadvantaged.
- The destruction of the victim’s data should be regarded as damage, compensable under specific conditions.
- It is not necessary to give devices or autonomous systems a legal personality, as the harm these may cause can and should be attributable to existing persons or bodies.