
42 Amendments of Peter POLLÁK related to 2021/0106(COD)

Amendment 64 #
Proposal for a regulation
Recital 3
(3) Artificial intelligence is a fast evolving family of technologies that can contribute to a wide array of economic and societal benefits across the entire spectrum of industries and social activities. By improving prediction, optimising operations and resource allocation, and personalising digital solutions available for individuals and organisations, the use of artificial intelligence can provide key competitive advantages to companies and support socially and environmentally beneficial outcomes, for example in healthcare, farming, education and training, media, mobility, infrastructure management, energy, transport and logistics, public services, security, justice, resource and energy efficiency, and climate change mitigation and adaptation.
2022/04/01
Committee: CULT
Amendment 68 #
Proposal for a regulation
Recital 4
(4) At the same time, depending on the circumstances regarding its specific application and use, artificial intelligence may generate risks and cause harm to public interests, private data and rights that are protected by Union law. Such harm might be material or immaterial.
2022/04/01
Committee: CULT
Amendment 70 #
Proposal for a regulation
Recital 5
(5) A Union legal framework laying down harmonised rules on artificial intelligence is therefore needed to foster the development, use and uptake of artificial intelligence in the internal market that at the same time meets a high level of protection of public interests, such as health and safety and the protection of fundamental rights, as recognised and protected by Union law. To achieve that objective, rules regulating the placing on the market and putting into service of certain AI systems should be laid down, thus ensuring the smooth functioning of the internal market and allowing those systems to benefit from the principle of free movement of goods and services. By laying down those rules, this Regulation supports the objective of the Union of being a global leader in the development of secure, trustworthy and ethical artificial intelligence, as stated by the European Council33, and it ensures the protection of ethical principles, as specifically requested by the European Parliament34 with a human-centric approach and in compliance with freedom of expression, freedom of speech, media freedom, pluralism and cultural diversity.
_________________
33 European Council, Special meeting of the European Council (1 and 2 October 2020) – Conclusions, EUCO 13/20, 2020, p. 6.
34 European Parliament resolution of 20 October 2020 with recommendations to the Commission on a framework of ethical aspects of artificial intelligence, robotics and related technologies, 2020/2012(INL).
2022/04/01
Committee: CULT
Amendment 79 #
Proposal for a regulation
Recital 9
(9) For the purposes of this Regulation the notion of publicly accessible space should be understood as referring to any physical place that is accessible to the public, irrespective of whether the place in question is privately or publicly owned. Therefore, the notion does not cover places that are private in nature and normally not freely accessible for third parties, including law enforcement authorities, unless those parties have been specifically invited or authorised, such as homes, private clubs, offices, warehouses and factories. Online spaces are not covered either, as they are not physical spaces. However, the mere fact that certain conditions for accessing a particular space may apply, such as admission tickets or age restrictions, does not mean that the space is not publicly accessible within the meaning of this Regulation. Consequently, in addition to public spaces such as streets, relevant parts of government buildings and most transport infrastructure, spaces such as cinemas, theatres, shops, museums, monuments, cultural places, cultural institutions and shopping centres are normally also publicly accessible. Whether a given space is accessible to the public should however be determined on a case-by-case basis, having regard to the specificities of the individual situation at hand.
2022/04/01
Committee: CULT
Amendment 114 #
Proposal for a regulation
Recital 35
(35) AI systems used in education or vocational training, notably for determining access or assigning persons to educational and vocational training institutions or to evaluate persons on tests as part of or as a precondition for their education or for determining the course of study a student should follow should be considered high-risk, since they may determine the educational and professional course of a person’s life and therefore affect their ability to secure their livelihood. When improperly designed and used, such systems may violate the right to education and training as well as the right not to be discriminated against and perpetuate historical patterns of discrimination. AI systems used to monitor students’ behaviour and emotions during tests at education and training institutions should be considered high-risk, since they also interfere with students’ rights to privacy and data protection. The use of AI to check for fraud in tests or exams, such as plagiarism, should not be considered high-risk.
2022/04/01
Committee: CULT
Amendment 130 #
Proposal for a regulation
Recital 70
(70) Certain AI systems intended to interact with natural persons or to generate content may pose specific risks of impersonation or deception irrespective of whether they qualify as high-risk or not. In certain circumstances, the use of these systems should therefore be subject to specific transparency obligations without prejudice to the requirements and obligations for high-risk AI systems. In particular, natural persons should be notified that they are interacting with an AI system, unless this is obvious from the circumstances and the context of use or where the content is doubtless used to form part of a creative, artistic or fictional cinematographic work. Moreover, natural persons should be notified when they are exposed to an emotion recognition system or a biometric categorisation system. Such information and notifications should be provided in accessible formats for persons with disabilities or other vulnerabilities. Further, users, who use an AI system to generate or manipulate image, audio or video content that appreciably resembles existing persons, places or events and would falsely appear to a person to be authentic, should disclose in a clear manner that the content has been artificially created or manipulated by labelling the artificial intelligence output accordingly and disclosing its artificial origin.
2022/04/01
Committee: CULT
Amendment 177 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) the placing on the market, putting into service or use of an AI system that exploits any of the vulnerabilities of children or a specific group of persons due to their age, physical or mental disability, in order to materially distort the behaviour of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;
2022/04/01
Committee: CULT
Amendment 239 #
Proposal for a regulation
Article 52 – paragraph 3 – introductory part
3. Users of an AI system that generates or manipulates image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and would falsely appear to a person to be authentic or truthful (‘deep fake’), shall disclose in an appropriate, clear, repetitive and visible manner that the content has been artificially generated or manipulated.
2022/04/01
Committee: CULT
Amendment 241 #
Proposal for a regulation
Article 52 – paragraph 3 – subparagraph 1
However, the first subparagraph shall not apply where the use is authorised by law to detect, prevent, investigate and prosecute criminal offences or where the content forms part of an evidently artistic, creative or fictional cinematographic and analogous work, or it is necessary for the exercise of the right to freedom of expression and the right to freedom of the arts and sciences guaranteed in the Charter of Fundamental Rights of the EU, and subject to appropriate safeguards for the rights and freedoms of third parties.
2022/04/01
Committee: CULT
Amendment 413 #
Proposal for a regulation
Recital 14
(14) In order to introduce a proportionate and effective set of binding rules for AI systems, a clearly defined risk-based approach should be followed. That approach should tailor the type and content of such rules to the intensity and scope of the risks that AI systems can generate for individuals and society, rather than depend on the type of technology. It is therefore necessary to prohibit certain artificial intelligence practices, to lay down requirements for high-risk AI systems and obligations for the relevant operators, and to lay down transparency obligations for certain AI systems.
2022/06/13
Committee: IMCOLIBE
Amendment 441 #
Proposal for a regulation
Recital 17 a (new)
(17 a) AI systems used in law enforcement and criminal justice contexts based on predictive methods, profiling and risk assessment pose an unacceptable risk to fundamental rights and in particular to the right of non-discrimination, insofar as they contradict the fundamental right to be presumed innocent and are reflective of historical, systemic, institutional and societal discrimination and other discriminatory practices. These AI systems should therefore be prohibited;
2022/06/13
Committee: IMCOLIBE
Amendment 454 #
Proposal for a regulation
Recital 18
(18) The use of AI systems for ‘real-time’ remote biometric identification of natural persons in publicly accessible spaces for the purpose of law enforcement is considered particularly intrusive in the rights and freedoms of the concerned persons, to the extent that it may affect the private life of a large part of the population, evoke a feeling of constant surveillance and indirectly dissuade the exercise of the freedom of assembly and other fundamental rights. In addition, the immediacy of the impact and the limited opportunities for further checks or corrections in relation to the use of such systems operating in ‘real-time’ carry heightened risks for the rights and freedoms of the persons that are concerned by law enforcement activities.
2022/06/13
Committee: IMCOLIBE
Amendment 465 #
Proposal for a regulation
Recital 19
(19) The use of those systems for the purpose of law enforcement should therefore be prohibited, except in three exhaustively listed and narrowly defined situations, where the use is strictly necessary to achieve a substantial public interest, the importance of which outweighs the risks. Those situations involve the search for potential victims of crime, including missing children; certain threats to the life or physical safety of natural persons or of a terrorist attack; and the detection, localisation, identification or prosecution of perpetrators or suspects of the criminal offences referred to in Council Framework Decision 2002/584/JHA38 if those criminal offences are punishable in the Member State concerned by a custodial sentence or a detention order for a maximum period of at least three years and as they are defined in the law of that Member State. Such threshold for the custodial sentence or detention order in accordance with national law contributes to ensure that the offence should be serious enough to potentially justify the use of ‘real-time’ remote biometric identification systems. Moreover, of the 32 criminal offences listed in the Council Framework Decision 2002/584/JHA, some are in practice likely to be more relevant than others, in that the recourse to ‘real-time’ remote biometric identification will foreseeably be necessary and proportionate to highly varying degrees for the practical pursuit of the detection, localisation, identification or prosecution of a perpetrator or suspect of the different criminal offences listed and having regard to the likely differences in the seriousness, probability and scale of the harm or possible negative consequences.
_________________
38 Council Framework Decision 2002/584/JHA of 13 June 2002 on the European arrest warrant and the surrender procedures between Member States (OJ L 190, 18.7.2002, p. 1).
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 474 #
Proposal for a regulation
Recital 20
(20) In order to ensure that those systems are used in a responsible and proportionate manner, it is also important to establish that, in each of those three exhaustively listed and narrowly defined situations, certain elements should be taken into account, in particular as regards the nature of the situation giving rise to the request and the consequences of the use for the rights and freedoms of all persons concerned and the safeguards and conditions provided for with the use. In addition, the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement should be subject to appropriate limits in time and space, having regard in particular to the evidence or indications regarding the threats, the victims or perpetrator. The reference database of persons should be appropriate for each use case in each of the three situations mentioned above.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 487 #
Proposal for a regulation
Recital 21
(21) Each use of a ‘real-time’ remote biometric identification system in publicly accessible spaces for the purpose of law enforcement should be subject to an express and specific authorisation by a judicial authority or by an independent administrative authority of a Member State. Such authorisation should in principle be obtained prior to the use, except in duly justified situations of urgency, that is, situations where the need to use the systems in question is such as to make it effectively and objectively impossible to obtain an authorisation before commencing the use. In such situations of urgency, the use should be restricted to the absolute minimum necessary and be subject to appropriate safeguards and conditions, as determined in national law and specified in the context of each individual urgent use case by the law enforcement authority itself. In addition, the law enforcement authority should in such situations seek to obtain an authorisation as soon as possible, whilst providing the reasons for not having been able to request it earlier.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 495 #
Proposal for a regulation
Recital 22
(22) Furthermore, it is appropriate to provide, within the exhaustive framework set by this Regulation that such use in the territory of a Member State in accordance with this Regulation should only be possible where and in as far as the Member State in question has decided to expressly provide for the possibility to authorise such use in its detailed rules of national law. Consequently, Member States remain free under this Regulation not to provide for such a possibility at all or to only provide for such a possibility in respect of some of the objectives capable of justifying authorised use identified in this Regulation.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 498 #
Proposal for a regulation
Recital 23
(23) The use of AI systems for ‘real-time’ remote biometric identification of natural persons in publicly accessible spaces for the purpose of law enforcement necessarily involves the processing of biometric data. The rules of this Regulation that prohibit, subject to certain exceptions, such use, which are based on Article 16 TFEU, should apply as lex specialis in respect of the rules on the processing of biometric data contained in Article 10 of Directive (EU) 2016/680, thus regulating such use and the processing of biometric data involved in an exhaustive manner. Therefore, such use and processing should only be possible in as far as it is compatible with the framework set by this Regulation, without there being scope, outside that framework, for the competent authorities, where they act for purpose of law enforcement, to use such systems and process such data in connection thereto on the grounds listed in Article 10 of Directive (EU) 2016/680. In this context, this Regulation is not intended to provide the legal basis for the processing of personal data under Article 8 of Directive 2016/680. However, the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for purposes other than law enforcement, including by competent authorities, should not be covered by the specific framework regarding such use for the purpose of law enforcement set by this Regulation. Such use for purposes other than law enforcement should therefore not be subject to the requirement of an authorisation under this Regulation and the applicable detailed rules of national law that may give effect to it.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 508 #
Proposal for a regulation
Recital 24
(24) Any processing of biometric data and other personal data involved in the use of AI systems for biometric identification, other than in connection to the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement as regulated by this Regulation, including where those systems are used by competent authorities in publicly accessible spaces for other purposes than law enforcement, should continue to comply with all requirements resulting from Article 9(1) of Regulation (EU) 2016/679, Article 10(1) of Regulation (EU) 2018/1725 and Article 10 of Directive (EU) 2016/680, as applicable.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 592 #
Proposal for a regulation
Recital 39 a (new)
(39 a) AI systems in migration, asylum and border control management should in no circumstances be used by Member States or European Union institutions as a means to circumvent their international obligations under the Convention of 28 July 1951 relating to the Status of Refugees as amended by the Protocol of 31 January 1967, nor should they be used in any way to infringe on the principle of non-refoulement or to deny safe and effective legal avenues into the territory of the Union, including the right to international protection;
2022/06/13
Committee: IMCOLIBE
Amendment 1239 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – introductory part
(d) the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement, unless and in as far as such use is strictly necessary for one of the following objectives;
2022/06/13
Committee: IMCOLIBE
Amendment 1250 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point i
(i) the targeted search for specific potential victims of crime, including missing children;
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 1261 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point ii
(ii) the prevention of a specific, substantial and imminent threat to the life or physical safety of natural persons or of a terrorist attack;
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 1269 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point iii
(iii) the detection, localisation, identification or prosecution of a perpetrator or suspect of a criminal offence referred to in Article 2(2) of Council Framework Decision 2002/584/JHA62 and punishable in the Member State concerned by a custodial sentence or a detention order for a maximum period of at least three years, as determined by the law of that Member State.
_________________
62 Council Framework Decision 2002/584/JHA of 13 June 2002 on the European arrest warrant and the surrender procedures between Member States (OJ L 190, 18.7.2002, p. 1).
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 1290 #
Proposal for a regulation
Article 5 – paragraph 1 – point d a (new)
(d a) the use of predictive, profiling and risk assessment AI systems in law enforcement and criminal justice;
2022/06/13
Committee: IMCOLIBE
Amendment 1292 #
Proposal for a regulation
Article 5 – paragraph 1 – point d b (new)
(d b) the use of predictive, profiling and risk assessment AI systems by or on behalf of competent authorities in migration, asylum or border control management, to profile an individual or assess a risk, including a security risk, a risk of irregular immigration, or a health risk, posed by a natural person who intends to enter or has entered the territory of a Member State, on the basis of personal or sensitive data, known or predicted, except for the sole purpose of identifying specific care and support needs;
2022/06/13
Committee: IMCOLIBE
Amendment 1303 #
Proposal for a regulation
Article 5 – paragraph 1 – point d c (new)
(d c) the placing on the market, putting into service, or use of AI systems by law enforcement authorities or by competent authorities in migration, asylum and border control management, such as polygraphs and similar tools to detect deception, trustworthiness or related characteristics;
2022/06/13
Committee: IMCOLIBE
Amendment 1308 #
Proposal for a regulation
Article 5 – paragraph 1 – point d d (new)
(d d) the use of AI systems by or on behalf of competent authorities in migration, asylum and border control management, to forecast or predict individual or collective movement for the purpose of, or in any way reasonably foreseeably leading to, interdicting, curtailing or preventing migration or border crossings;
2022/06/13
Committee: IMCOLIBE
Amendment 1353 #
Proposal for a regulation
Article 5 – paragraph 2
2. The use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement for any of the objectives referred to in paragraph 1 point d) shall take into account the following elements:
(a) the nature of the situation giving rise to the possible use, in particular the seriousness, probability and scale of the harm caused in the absence of the use of the system;
(b) the consequences of the use of the system for the rights and freedoms of all persons concerned, in particular the seriousness, probability and scale of those consequences.
In addition, the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement for any of the objectives referred to in paragraph 1 point d) shall comply with necessary and proportionate safeguards and conditions in relation to the use, in particular as regards the temporal, geographic and personal limitations.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 1371 #
Proposal for a regulation
Article 5 – paragraph 3
3. As regards paragraphs 1, point (d) and 2, each individual use for the purpose of law enforcement of a ‘real-time’ remote biometric identification system in publicly accessible spaces shall be subject to a prior authorisation granted by a judicial authority or by an independent administrative authority of the Member State in which the use is to take place, issued upon a reasoned request and in accordance with the detailed rules of national law referred to in paragraph 4. However, in a duly justified situation of urgency, the use of the system may be commenced without an authorisation and the authorisation may be requested only during or after the use. The competent judicial or administrative authority shall only grant the authorisation where it is satisfied, based on objective evidence or clear indications presented to it, that the use of the ‘real-time’ remote biometric identification system at issue is necessary for and proportionate to achieving one of the objectives specified in paragraph 1, point (d), as identified in the request. In deciding on the request, the competent judicial or administrative authority shall take into account the elements referred to in paragraph 2.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 1384 #
Proposal for a regulation
Article 5 – paragraph 4
4. A Member State may decide to provide for the possibility to fully or partially authorise the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement within the limits and under the conditions listed in paragraphs 1, point (d), 2 and 3. That Member State shall lay down in its national law the necessary detailed rules for the request, issuance and exercise of, as well as supervision relating to, the authorisations referred to in paragraph 3. Those rules shall also specify in respect of which of the objectives listed in paragraph 1, point (d), including which of the criminal offences referred to in point (iii) thereof, the competent authorities may be authorised to use those systems for the purpose of law enforcement.
deleted
2022/06/13
Committee: IMCOLIBE
Amendment 1405 #
Proposal for a regulation
Article 5 a (new)
Article 5 a
Amendments to Article 5
The Commission is empowered to adopt delegated acts in accordance with Article 73 to update the list of AI systems and practices prohibited under Article 5 of the present Regulation, according to the latest developments in technology and to the assessment of increased or newly emerged risks to fundamental rights.
2022/06/13
Committee: IMCOLIBE
Amendment 1468 #
Proposal for a regulation
Article 7 – paragraph 1 – introductory part
1. The Commission is empowered to adopt delegated acts in accordance with Article 73 to update the list in Annex III by adding new area headings and high-risk AI systems where both of the following conditions are fulfilled:
2022/06/13
Committee: IMCOLIBE
Amendment 1476 #
Proposal for a regulation
Article 7 – paragraph 1 – point a
(a) the AI systems are intended to be used in any of the areas listed in points 1 to 8 of Annex III or in the newly identified area headings;
2022/06/13
Committee: IMCOLIBE
Amendment 1909 #
Proposal for a regulation
Article 16 a (new)
Article 16 a
Obligations of users of high-risk AI systems
Users of high-risk AI systems shall conduct and publish a fundamental rights impact assessment, detailing specific information relating to the context of use of the high-risk AI system in question, including:
(a) the affected persons,
(b) intended purpose,
(c) geographic and temporal scope,
(d) assessment of the legality and fundamental rights impacts of the system,
(e) compatibility with accessibility legislation,
(f) potential direct and indirect impact on fundamental rights,
(g) any specific risk of harm likely to impact marginalised persons or those at risk of discrimination,
(h) the foreseeable impact of the use of the system on the environment,
(i) any other negative impact on the public interest,
(j) clear steps as to how the harms identified will be mitigated and how effective this mitigation is likely to be.
2022/06/13
Committee: IMCOLIBE
Amendment 2287 #
Proposal for a regulation
Title IV a (new)
Rights of affected persons
Article 52 a
1. Natural persons have the right not to be subject to non-compliant AI systems. The placing on the market, putting into service or use of a non-compliant AI system gives rise to the right of the affected natural persons subject to such non-compliant AI systems to seek and receive redress.
2. Natural persons have the right to be informed about the use and functioning of AI systems they have been or may be exposed to, particularly in the case of high-risk and other regulated AI systems, according to Article 52.
3. Natural persons and public interest organisations have the right to lodge a complaint before the relevant national supervisory authorities against a producer or user of non-compliant AI systems where they consider that their rights or the rights of the natural persons they represent under the present Regulation have been violated, and have the right to receive an effective remedy.
2022/06/13
Committee: IMCOLIBE
Amendment 2986 #
Proposal for a regulation
Article 84 – paragraph 6
6. In carrying out the evaluations and reviews referred to in paragraphs 1 to 4 the Commission shall take into account the positions and findings of the Board, of the European Parliament, of the Council, and of other relevant bodies or sources, including stakeholders, and in particular civil society.
2022/06/13
Committee: IMCOLIBE
Amendment 2993 #
Proposal for a regulation
Article 84 – paragraph 7
7. The Commission shall, if necessary, submit appropriate proposals to amend this Regulation, in particular taking into account developments in technology and new potential or realised risks to fundamental rights, and in the light of the state of progress in the information society.
2022/06/13
Committee: IMCOLIBE
Amendment 3203 #
Proposal for a regulation
Annex III – paragraph 1 – point 7 – point b
(b) AI systems intended to be used by competent public authorities or by third parties acting on their behalf to assess a risk, including but not limited to a security risk, a risk of irregular immigration, or a health risk, posed by a natural person who intends to enter or has entered into the territory of a Member State;
2022/06/13
Committee: IMCOLIBE
Amendment 3211 #
Proposal for a regulation
Annex III – paragraph 1 – point 7 – point d
(d) AI systems intended to assist competent public authorities for the examination and assessment of the veracity of evidence and claims in relation to applications for asylum, visa and residence permits and associated complaints with regard to the eligibility of the natural persons applying for a status.
2022/06/13
Committee: IMCOLIBE
Amendment 3220 #
Proposal for a regulation
Annex III – paragraph 1 – point 7 – point d a (new)
(d a) AI systems intended to be used by or on behalf of competent authorities in migration, asylum and border control management for the forecasting or prediction of trends related to migration, movement and border crossings;
2022/06/13
Committee: IMCOLIBE
Amendment 3224 #
Proposal for a regulation
Annex III – paragraph 1 – point 7 – point d b (new)
(d b) AI systems that are or may be used by or on behalf of competent authorities in law enforcement, migration, asylum and border control management for the biometric identification of natural persons;
2022/06/13
Committee: IMCOLIBE
Amendment 3226 #
Proposal for a regulation
Annex III – paragraph 1 – point 7 – point d c (new)
(d c) AI systems intended to be used by or on behalf of competent authorities in migration, asylum and border control management to monitor, surveil or process data in the context of border management activities for the purpose of recognising or detecting objects and natural persons;
2022/06/13
Committee: IMCOLIBE