
44 Amendments of Marion WALSMANN related to 2020/2014(INL)

Amendment 1 #
Draft opinion
Recital A
A. whereas emerging digital technologies, such as Artificial Intelligence (AI), the Internet of Things and of Services (IoT/IoS) or robotics, play and will continue to play an increasing role in our everyday lives and have the potential to contribute to the development of innovations in many sectors and offer benefits for consumers through innovative products and services and, for businesses, through optimised performance;
2020/05/27
Committee: IMCO
Amendment 5 #
Draft opinion
Recital A a (new)
Aa. whereas these emerging digital technologies are transforming the characteristics of many products and services, requiring in turn a clear safety and liability framework, ensuring both consumer protection and legal certainty for businesses;
2020/05/27
Committee: IMCO
Amendment 8 #
Draft opinion
Recital A b (new)
Ab. whereas the Union's existing safety and liability framework might need to be adapted, as highlighted by the Commission's Report on the safety and liability implications for Artificial Intelligence, the Internet of Things and robotics;
2020/05/27
Committee: IMCO
Amendment 9 #
Draft opinion
Recital A c (new)
Ac. whereas product safety and product liability are two complementary mechanisms pursuing the same policy goal of a functioning single market for goods and services, and this Opinion suggests possible adjustments to the Union liability frameworks in light of the increased importance of emerging digital technologies;
2020/05/27
Committee: IMCO
Amendment 15 #
Draft opinion
Recital C
C. whereas robust liability mechanisms remedying damage contribute to better protection of citizens and consumers from harm, creation of trust in emerging digital technologies, while ensuring legal certainty for businesses and enabling them to innovate;
2020/05/27
Committee: IMCO
Amendment 21 #
Draft opinion
Paragraph 1
1. Welcomes the Commission’s aim, which is to make the Union legal framework fit the new technological developments, ensuring a high level of protection for consumers from harm caused by new technologies while maintaining the balance with the needs of technological innovation;
deleted
2020/05/27
Committee: IMCO
Amendment 25 #
Draft opinion
Paragraph 1 a (new)
1a. Emphasises that the Product Liability Directive was adopted in 1985 and revised in 1999, that products have evolved considerably since then, and that the Product Liability Directive is therefore no longer fit for purpose and needs to be updated;
2020/05/27
Committee: IMCO
Amendment 26 #
Draft opinion
Paragraph 2
2. Points out the need to adapt the Union's existing liability framework, and in particular Council Directive 85/374/EEC1 (the Product Liability Directive - PLD), to the digital world; calls on the Commission to revise the PLD by addressing the challenges posed by emerging digital technologies such as artificial intelligence, the Internet of Things (IoT) or robotics, thereby ensuring effective citizen and consumer protection from harm as well as legal certainty for businesses, while avoiding high costs and risks, especially for small and medium enterprises and start-ups;
__________________
1 Council Directive 85/374/EEC of 25 July 1985 on the approximation of the laws, regulations and administrative provisions of the Member States concerning liability for defective products (OJ L 210, 7.8.1985, p. 29).
2020/05/27
Committee: IMCO
Amendment 41 #
Draft opinion
Paragraph 4
4. Calls on the Commission to revise the product liability framework by taking into account the specific challenges of digitalisation for liability law, such as connectivity, openness, autonomy, opacity, (un)predictability, data-drivenness and vulnerability;
2020/05/27
Committee: IMCO
Amendment 45 #
Motion for a resolution
Paragraph 2
2. Firmly believes that in order to efficiently exploit the advantages and prevent potential misuses, principle-based and future-proof legislation across the EU for all AI-systems is crucial; is of the opinion that, while sector specific regulations for the broad range of possible applications are preferable, a horizontal legal framework based on common principles seems necessary to establish equal standards across the Union, effectively protect our European values and ensure legal clarity, with the legal framework being limited to filling existing legal gaps;
2020/05/28
Committee: JURI
Amendment 51 #
Draft opinion
Paragraph 5
5. Urges the Commission to scrutinise whether, and if so to what extent, it is necessary to include software in the definition of ‘products’ under the Product Liability Directive, and asks the Commission to update concepts such as ‘producer’, ‘damage’ and ‘defect’; asks the Commission to also examine whether the product liability framework needs to be revised in order to protect injured parties efficiently as regards products that are purchased as a bundle with related services;
2020/05/27
Committee: IMCO
Amendment 52 #
Draft opinion
Paragraph 5 a (new)
5a. Stresses that the Product Liability Directive considers the moment when products are put into circulation as the decisive moment for the producer's liability, and that for AI systems the producer retains some degree of control after the product has been put into circulation; asks the Commission, therefore, to update this concept in its revision of the Product Liability Directive;
2020/05/27
Committee: IMCO
Amendment 55 #
Draft opinion
Paragraph 5 a (new)
5a. Asks the Commission to consider the liability of online marketplaces by qualifying them as 'supplier' under the Product Liability Directive;
2020/05/27
Committee: IMCO
Amendment 64 #
Draft opinion
Paragraph 7
7. Calls on the Commission to consider adapting the rules governing the burden of proof for harm caused by emerging digital technologies, in order to empower harmed consumers while preventing abuse and providing legal certainty for businesses;
2020/05/27
Committee: IMCO
Amendment 71 #
Motion for a resolution
Paragraph 7
7. Considers that the Product Liability Directive (PLD) has proven to be an effective means of getting compensation for harm triggered by a defective product; hence, notes that it should be reviewed, and that this review should include adjustments of the definitions of product and defect in order to allow civil liability claims against the producer of a defective AI-system; is of the opinion that, for the purpose of legal certainty throughout the Union, the ‘backend operator’ should fall under the same liability rules as the producer, manufacturer and developer;
2020/05/28
Committee: JURI
Amendment 81 #
Draft opinion
Paragraph 9
9. Asks the Commission to carefully assess the advantages and disadvantages of the introduction of a separate yet complementary strict liability regime for AI systems presenting a high risk to cause harm or damage to one or more persons in a manner that is random and impossible to predict in advance, taking into account its likely impact on the protection of citizens and consumers from harm, the capacity of businesses - particularly SMEs - to innovate, the coherence of the Union's safety and liability framework and on the principles of subsidiarity and proportionality.
2020/05/27
Committee: IMCO
Amendment 85 #
Motion for a resolution
Paragraph 10
10. Opines that liability rules involving the deployer should in principle cover all operations of AI-systems, no matter where the operation takes place and whether it happens physically or virtually; remarks that operations in public spaces that expose many third persons to a risk constitute, however, cases that require further consideration; considers that the potential victims of harm or damage are often not aware of the operation and regularly do not have contractual liability claims against the deployer; notes that when harm or damage materialises, such third persons would then only have a fault-liability claim, and they might find it difficult to prove the fault of the deployer of the AI-system and thus corresponding liability claims might fail;
2020/05/28
Committee: JURI
Amendment 114 #
Motion for a resolution
Paragraph 15
15. Recognises that, given the rapid technological change and the required technical expertise, the Commission should draw up a list of high-risk AI-systems through a delegated act, in respect of which the European Parliament may raise its objections; believes that the Commission should closely cooperate with a newly formed standing committee similar to the existing Standing Committee on Precursors or the Technical Committee on Motor Vehicles, which include national experts of the Member States and stakeholders; considers that the balanced membership of the ‘High-Level Expert Group on Artificial Intelligence’ could serve as an example for the formation of the group of stakeholders;
2020/05/28
Committee: JURI
Amendment 119 #
Motion for a resolution
Paragraph 16
16. Believes that the proposed Regulation should set the limitation period and, in line with strict liability systems of the Member States, the proposed Regulation should only cover harm to the important legally protected rights such as life, health, physical integrity and property; in this context, the Commission should examine whether the scope of compensation should be extended to include economic damage;
2020/05/28
Committee: JURI
Amendment 137 #
Motion for a resolution
Paragraph 19
19. Is of the opinion that, based on the significant potential to cause harm and by taking Directive 2009/103/EC7 into account, all deployers of high-risk AI-systems listed in the Annex to the proposed Regulation should hold liability insurance; considers that such a mandatory insurance regime for high-risk AI-systems should cover the amounts and the extent of compensation laid down by the proposed Regulation;
_________________
7 OJ L 263, 7.10.2009, p. 11.
2020/05/28
Committee: JURI
Amendment 150 #
Motion for a resolution
Annex I – part A – paragraph 1 – indent 3
- There should be no over-regulation and more red tape must be prevented, as this would hamper European innovation in AI, especially if the technology, product or service is developed by SMEs or start-ups.
2020/05/28
Committee: JURI
Amendment 156 #
Motion for a resolution
Annex I – part A – paragraph 1 – indent 5
- This Report and the Product Liability Directive are two pillars of a common liability framework for AI-systems and require close coordination and alignment between all political actors.
2020/05/28
Committee: JURI
Amendment 158 #
Motion for a resolution
Annex I – part A – paragraph 1 – indent 6
- Citizens need to be entitled to the same level of protection and rights, no matter whether the harm is caused by an AI-system or not, or whether it takes place physically or virtually, so that their confidence in this new technology is strengthened.
2020/05/28
Committee: JURI
Amendment 192 #
Motion for a resolution
Annex I – part B – recital 7
(7) Council Directive 85/374/EEC3 (the Product Liability Directive) has proven to be an effective means of getting compensation for damage triggered by a defective product. However, so that it can also be used with regard to civil liability claims of a party who suffers harm or damage against the producer of a defective AI-system, in line with the better regulation principles of the Union, legislative adjustments to that Directive are necessary. The existing fault-based liability law of the Member States also offers in most cases a sufficient level of protection for persons that suffer harm or damages caused by an interfering third person, as that interference regularly constitutes a fault-based action. Consequently, this Regulation should focus on claims against the deployer of an AI-system.
_________________
3 Council Directive 85/374/EEC of 25 July 1985 on the approximation of the laws, regulations and administrative provisions of the Member States concerning liability for defective products, OJ L 210, 7.8.1985, p. 29.
2020/05/28
Committee: JURI
Amendment 212 #
Motion for a resolution
Annex I – part B – recital 12
(12) The power to adopt delegated acts in accordance with Article 290 of the Treaty on the Functioning of the European Union should be delegated to the Commission so that it can draw up a list of the types of AI-systems that pose a high risk and the critical sectors where they are used. Based on the definitions and provisions laid down in this Regulation, the Commission should review the Annex every six months and, if necessary, amend it by means of delegated acts. To give businesses enough planning and investment security, subsequent changes to the critical sectors should only be made every 12 months. Developers are called upon to notify the Commission if they are currently working on a new technology, product or service that falls under one of the existing critical sectors provided for in the Annex and which later could qualify for a high risk AI-system.
2020/05/28
Committee: JURI
Amendment 225 #
Motion for a resolution
Annex I – part B – recital 14
(14) In line with strict liability systems of the Member States, this Regulation should cover only harm or damage to life, health, physical integrity and property. For the same reason, it should determine the amount and extent of compensation, as well as the limitation period for bringing forward liability claims. In contrast to the Product Liability Directive, this Regulation should set out a significantly lower ceiling for compensation, as it only refers to a single operation of an AI-system, while the former refers to a number of products or even a product line with the same defect.
2020/05/28
Committee: JURI
Amendment 245 #
Motion for a resolution
Annex I – part B – recital 20
(20) Despite missing historical claim data, there are already insurance products that are developed area-by-area and cover-by-cover as technology develops. Many insurers specialise in certain market segments (e.g. SMEs) or in providing cover for certain product types (e.g. electrical goods), which means that there will usually be an insurance product available for the insured. If a new type of insurance is needed, the insurance market will develop and offer a fitting solution and will thus close the insurance gap. In exceptional cases, in which the compensation significantly exceeds the maximum amounts set out in this Regulation, Member States should be encouraged to set up a special compensation fund for a limited period of time that addresses the specific needs of those cases.
2020/05/28
Committee: JURI
Amendment 270 #
Motion for a resolution
Annex I – part B – Article 3 – point a
(a) ‘artificial intelligence’ means a system that includes methods and procedures that enable technical systems to perceive their environment, process what is perceived and solve problems independently, make decisions, act and learn from the consequences of those decisions and actions; AI-systems can be purely software-based, acting in the virtual world, or can be embedded in hardware devices;
2020/05/28
Committee: JURI
Amendment 278 #
Motion for a resolution
Annex I – part B – Article 3 – point c
(c) ‘high risk’ means a significant potential in an autonomously operating AI-system to cause significant harm or damage to one or more persons in a manner that is random and impossible to predict in advance; the significance of the potential depends on the interplay between the severity of possible harm or damage, the likelihood that the risk materialises and the manner in which the AI-system is being used;
2020/05/28
Committee: JURI
Amendment 280 #
Motion for a resolution
Annex I – part B – Article 3 – point d
(d) ‘deployer’ means any natural or legal person who decides on the specific use of the AI-system, exercises control over the associated risk and benefits from its operation;
2020/05/28
Committee: JURI
Amendment 297 #
Motion for a resolution
Annex I – part B – Article 3 – point g a (new)
(ga) ‘force majeure’ means, in accordance with national rules, exceptional and unforeseeable circumstances beyond the control of the deployer, the consequences of which could not have been avoided even if all due care had been exercised.
2020/05/28
Committee: JURI
Amendment 306 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2 – introductory part
2. The Commission shall be tasked with drawing up a list of high-risk AI-systems as well as the critical sectors where they are used, by means of delegated acts in accordance with Article 13.
2020/05/28
Committee: JURI
Amendment 308 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2 – point a
(a) including new types of high-risk AI-systems and critical sectors in which they are deployed;
deleted
2020/05/28
Committee: JURI
Amendment 311 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2 – point b
(b) deleting types of AI-systems that can no longer be considered to pose a high risk; and/or
deleted
2020/05/28
Committee: JURI
Amendment 314 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2 – point c
(c) changing the critical sectors for existing high-risk AI-systems.
deleted
2020/05/28
Committee: JURI
Amendment 317 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2 – subparagraph 2
Any delegated act amending the Annex shall come into force six months after its adoption. When determining new critical sectors and/or high-risk AI-systems to be inserted by means of delegated acts in the Annex, the Commission shall take full account of the criteria set out in this Regulation, in particular those set out in Article 3(c).
2020/05/28
Committee: JURI
Amendment 327 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 4
4. The deployer of a high-risk AI-system shall ensure they have liability insurance cover that is adequate in relation to the amounts and extent of compensation provided for in Articles 5 and 6 of this Regulation. If compulsory insurance regimes already in force pursuant to other Union or national law are considered to cover the operation of the AI-system, the obligation to take out insurance for the AI-system pursuant to this Regulation shall be deemed fulfilled, as long as the relevant existing compulsory insurance covers the amounts and the extent of compensation provided for in Articles 5 and 6 of this Regulation.
2020/05/28
Committee: JURI
Amendment 337 #
Motion for a resolution
Annex I – part B – Article 5 – paragraph 1 – introductory part
1. The level of compensation to be paid by a deployer of a high-risk AI-system that has been held liable for harm or damage under this Regulation shall be determined in accordance with the relevant national provisions.
2020/05/28
Committee: JURI
Amendment 342 #
Motion for a resolution
Annex I – part B – Article 5 – paragraph 1 – point a
(a) up to a maximum total amount of EUR ten million in the event of death or of harm caused to the health or physical integrity of one or several persons as the result of the same operation of the same high-risk AI-system;
deleted
2020/05/28
Committee: JURI
Amendment 347 #
Motion for a resolution
Annex I – part B – Article 5 – paragraph 1 – point b
(b) up to a maximum total amount of EUR two million in the event of damage caused to property, including when several items of property of one or several persons were damaged as a result of the same operation of the same high-risk AI-system; where the affected person also holds a contractual liability claim against the deployer, no compensation shall be paid under this Regulation if the total amount of the damage to property is of a value that falls below EUR 500.
deleted
2020/05/28
Committee: JURI
Amendment 352 #
Motion for a resolution
Annex I – part B – Article 5 – paragraph 1 – point 2
2. Where the combined compensation to be paid to several persons who suffer harm or damage caused by the same operation of the same high-risk AI-system exceeds the maximum total amounts provided for in paragraph 1, the amounts to be paid to each person shall be reduced pro-rata so that the combined compensation does not exceed the maximum amounts set out in paragraph 1.
deleted
2020/05/28
Committee: JURI
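For readers less familiar with pro-rata capping, the following is a minimal illustrative sketch of how the reduction rule quoted in Amendment 352 above (a rule which that amendment itself proposes to delete) could be read in arithmetic terms; the function name, variable names and figures are hypothetical and do not come from the proposed Regulation.

def prorate_compensation(claims_eur, cap_eur):
    """Reduce each claim proportionally so the combined payout stays within the cap."""
    total = sum(claims_eur)
    if total <= cap_eur:
        return claims_eur  # the cap is not exceeded, so the claims are paid in full
    factor = cap_eur / total
    return [claim * factor for claim in claims_eur]

# Hypothetical example: three property-damage claims measured against a EUR 2 million ceiling;
# each claim is scaled by the same factor (2/3), so the combined payout equals exactly EUR 2 million.
print(prorate_compensation([1_500_000, 800_000, 700_000], 2_000_000))
# [1000000.0, 533333.33..., 466666.66...]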
Amendment 357 #
Motion for a resolution
Annex I – part B – Article 6
Extent of compensation
1. Within the amount set out in Article 5(1)(a), compensation to be paid by the deployer held liable in the event of physical harm followed by the death of the affected person, shall be calculated based on the costs of medical treatment that the affected person underwent prior to his or her death, and of the pecuniary prejudice sustained prior to death caused by the cessation or reduction of the earning capacity or the increase in his or her needs for the duration of the harm prior to death. The deployer held liable shall furthermore reimburse the funeral costs for the deceased affected person to the party who is responsible for defraying those expenses. If at the time of the incident that caused the harm leading to his or her death, the affected person was in a relationship with a third party and had a legal obligation to support that third party, the deployer held liable shall indemnify the third party by paying maintenance to the extent to which the affected person would have been obliged to pay, for the period corresponding to an average life expectancy for a person of his or her age and general description. The deployer shall also indemnify the third party if, at the time of the incident that caused the death, the third party had been conceived but had not yet been born.
2. Within the amount set out in Article 5(1)(b), compensation to be paid by the deployer held liable in the event of harm to the health or the physical integrity of the affected person shall include the reimbursement of the costs of the related medical treatment as well as the payment for any pecuniary prejudice sustained by the affected person, as a result of the temporary suspension, reduction or permanent cessation of his or her earning capacity or the consequent, medically certified increase in his or her needs.
Article 6 deleted
2020/05/28
Committee: JURI
Amendment 424 #
By 1 January 202X [5 years after the date of application of this Regulation], and every three years thereafter, the Commission shall present to the European Parliament, the Council and the European Economic and Social Committee a detailed report reviewing this Regulation in the light of the further development of Artificial Intelligence. In the context of this report, the Commission shall examine, inter alia, whether the scope of this Regulation should be extended to include economic damage.
2020/05/28
Committee: JURI
Amendment 429 #
Motion for a resolution
Annex I – part B – Annex
Exhaustive list of AI-systems that pose a high risk as well as of critical sectors where the AI-systems are being deployed1
AI-systems
Critical sector
[...]
_________________
1 *This Annex should aim to replicate the level of detail that appears for instance in Annex I of Regulation 2018/858 (Approval and market surveillance of motor vehicles and their trailers, and of systems, components and separate technical units intended for such vehicles).
deleted
2020/05/28
Committee: JURI