
Activities of Gilles LEBRETON related to 2020/2014(INL)

Legal basis opinions (0)

Amendments (12)

Amendment 5 #
Motion for a resolution
Citation 23 a (new)
- having regard to Council Directive 2000/78/EC of 27 November 2000 establishing a general framework for equal treatment in employment and occupation (OJ L 303, 2.12.2000, p. 16),
2020/05/28
Committee: JURI
Amendment 6 #
Motion for a resolution
Citation 23 b (new)
- having regard to the directives on equal treatment of men and women with regard to employment and access to goods and services,
2020/05/28
Committee: JURI
Amendment 7 #
Motion for a resolution
Citation 23 c (new)
- having regard to various consumer protection rules such as the Unfair Commercial Practices Directive (Directive 2005/29/EC) and the Consumer Rights Directive (Directive 2011/83/EC),
2020/05/28
Committee: JURI
Amendment 8 #
Motion for a resolution
Citation 23 d (new)
- having regard to Directive 2009/48/EC on the safety of toys,
2020/05/28
Committee: JURI
Amendment 9 #
Motion for a resolution
Citation 23 e (new)
- having regard to Regulation (EU) 2017/745 of the European Parliament and of the Council on medical devices, amending Directive 2001/83/EC and applicable from 26 May 2020,
2020/05/28
Committee: JURI
Amendment 43 #
Motion for a resolution
Paragraph 1
1. Considers that the challenge related to the introduction of AI-systems into society and the economy is one of the most important questions on the current political agenda; whereas technologies based on AI could improve our lives in almost every sector, from the personal sphere (e.g. personalised education, fitness programs, credit provision and court orders) to global challenges (e.g. climate change, hunger and starvation);
2020/05/28
Committee: JURI
Amendment 89 #
Motion for a resolution
Paragraph 11
11. Considers it appropriate to define the deployer as the person who decides on the use of the AI-system, who exercises control over the risk and who benefits from its operation; considers that exercising control means any action of the deployer that affects the manner of the operation from start to finish or that changes specific functions or processes within the AI-system; takes the view that those tasked with deployment should monitor the good intentions of the developers throughout the value chain in order to ensure the protection of consumers through trustworthy AI;
2020/05/28
Committee: JURI
Amendment 112 #
Motion for a resolution
Paragraph 15
15. Recommends that all high-risk AI-systems be listed in an Annex to the proposed Regulation; recognises that, given the rapid technological change and the required technical expertise, it should be up to the Commission to review that Annex every six months and, if necessary, amend it through a delegated act; believes that the Commission should closely cooperate with a newly formed standing committee similar to the existing Standing Committee on Precursors or the Technical Committee on Motor Vehicles, which include national experts of the Member States and stakeholders; considers that the balanced membership of the ‘High-Level Expert Group on Artificial Intelligence’ could serve as an example for the formation of the group of stakeholders, with the addition of ethics experts, anthropologists, sociologists and mental-health specialists;
2020/05/28
Committee: JURI
Amendment 157 #
Motion for a resolution
Annex I – part A – paragraph 1 – indent 6
- Citizens need to be entitled to the same level of protection and rights, no matter if the harm is caused by an AI-system or not, if it takes place physically or virtually, or if it is material or non-material. As set out in the Commission Communication of 19 February 2020 on the safety and liability implications of AI and robotics, ‘explicit obligations for producers, among others, of humanoid AI to explicitly take into account the non-material damage that their products could cause to users, in particular vulnerable users such as elderly people in care settings’ should be taken into account in this EU legislation.
2020/05/28
Committee: JURI
Amendment 164 #
Motion for a resolution
Annex I – part B – recital 1
(1) The concept of ‘liability’ plays an important double role in our daily life: on the one hand, it ensures that a person who has suffered harm or damage is entitled to claim compensation from the party proven to be liable for that harm or damage, and on the other hand, it provides the economic incentives for persons to avoid causing harm or damage, whether material or non-material, in the first place. Any liability framework should strive to strike a balance between efficiently protecting potential victims of damage and, at the same time, providing enough leeway to make the development of new technologies, products or services possible.
2020/05/28
Committee: JURI
Amendment 170 #
Motion for a resolution
Annex I – part B – recital 3
(3) The rise of artificial intelligence (AI), however, presents a significant challenge for the existing liability frameworks. Using AI-systems in our daily life will lead to situations in which their opacity (“black box” element) makes it extremely expensive or even impossible to identify who was in control of the risk of using the AI-system in question or which code or input has caused the harmful operation. This difficulty is further compounded by the connectivity between an AI-system and other AI-systems and non-AI-systems, by its dependency on external data, by its vulnerability to cybersecurity breaches as well as by the increasing autonomy of AI-systems triggered by machine-learning and deep-learning capabilities. Besides these complex features and potential vulnerabilities, AI-systems could also be used to cause severe harm, such as compromising our human dignity, our values and freedoms by tracking individuals against their will, by introducing Social Credit Systems or by constructing lethal autonomous weapon systems, or to take biased decisions in matters of health insurance, credit provision, court orders, recruitment or employment.
2020/05/28
Committee: JURI
Amendment 248 #
Motion for a resolution
Annex I – part B – recital 20
(20) Despite the lack of historical claims data, for reasons such as the updating of algorithms or the anonymising of data, insurance products are already being developed area-by-area and cover-by-cover as technology develops. Many insurers specialise in certain market segments (e.g. SMEs) or in providing cover for certain product types (e.g. electrical goods), which means that there will usually be an insurance product available for the insured. If a new type of insurance is needed, the insurance market will develop and offer a fitting solution and thus close the insurance gap. In exceptional cases in which the compensation significantly exceeds the maximum amounts set out in this Regulation, Member States should be encouraged to set up a special compensation fund for a limited period of time that addresses the specific needs of those cases.
2020/05/28
Committee: JURI