27 Amendments of Alexis GEORGOULIS related to 2021/0106(COD)
Amendment 67 #
Proposal for a regulation
Recital 4
(4) At the same time, depending on the circumstances regarding its specific application and use, artificial intelligence may generate risks and cause harm to public interests and fundamental rights of people in employment and in learning and of socially active people that are protected by Union law. Such harm might be material or immaterial.
Amendment 71 #
Proposal for a regulation
Recital 5
(5) A Union legal framework laying down harmonised rules on artificial intelligence is therefore needed to foster the development, use and uptake of artificial intelligence in the internal market that at the same time meets a high level of protection of public interests, such as health and safety and the protection of fundamental rights, as recognised and protected by Union law. To achieve that objective, rules regulating the placing on the market and putting into service of certain AI systems should be laid down, thus ensuring the smooth functioning of the internal market and allowing those systems to benefit from the principle of free movement of goods and services. By laying down those rules, this Regulation supports the objective of the Union of being a global leader in the development of secure, trustworthy and ethical artificial intelligence based on fundamental rights, as stated by the European Council33, and it ensures the protection of ethical principles, as specifically requested by the European Parliament34. _________________ 33 European Council, Special meeting of the European Council (1 and 2 October 2020) – Conclusions, EUCO 13/20, 2020, p. 6. 34 European Parliament resolution of 20 October 2020 with recommendations to the Commission on a framework of ethical aspects of artificial intelligence, robotics and related technologies, 2020/2012(INL).
Amendment 81 #
Proposal for a regulation
Recital 9
(9) For the purposes of this Regulation the notion of publicly accessible space should be understood as referring to any physical or virtual place that is accessible to the public, irrespective of whether the place in question is privately or publicly owned or owned on a non-profit basis. Therefore, the notion does not cover places that are private in nature and normally not freely accessible for third parties, including law enforcement authorities, unless those parties have been specifically invited or authorised, such as homes, private clubs, offices, warehouses and factories. Online spaces are not covered either, as they are not virtual protected spaces. However, the mere fact that certain conditions for accessing a particular space may apply, such as admission tickets or age restrictions, does not mean that the space is not publicly accessible within the meaning of this Regulation. Consequently, in addition to public spaces such as streets, parks, sports complexes, relevant parts of government buildings and most transport infrastructure, spaces such as cinemas, theatres, museums, libraries, shops and shopping centres are normally also publicly accessible. Whether a given space is accessible to the public should however be determined on a case-by-case basis, having regard to the specificities of the individual situation at hand as regards the use made of that space.
Amendment 88 #
Proposal for a regulation
Recital 15
(15) Aside from the many beneficial uses of artificial intelligence, that technology can also be misused and provide novel and powerful tools for manipulative, exploitative and social control practices. Such practices are particularly harmful and should be prohibited because they contradict Union values of respect for human dignity, freedom, equality, democracy and the rule of law and Union fundamental rights, including the right to non-discrimination, employee protection, data protection and privacy and the rights of the child.
Amendment 91 #
Proposal for a regulation
Recital 17
(17) AI systems providing social scoring of natural persons for general purpose by public authorities or on their behalf may lead to discriminatory outcomes and the exclusion of certain groups. They may violate the right to dignity and non-discrimination and the values of equality and justice. Such AI systems evaluate or classify the trustworthiness, interests and abilities of natural persons based on their social behaviour in multiple contexts or known or predicted personal, personality or identity characteristics. The social score obtained from such AI systems may lead to the detrimental or unfavourable treatment of natural persons or whole groups thereof in social contexts, which are unrelated to the context in which the data was originally generated or collected or to a detrimental treatment that is disproportionate or unjustified to the gravity of their social behaviour. Such AI systems should be therefore prohibited.
Amendment 95 #
Proposal for a regulation
Recital 18
(18) The use of AI systems for ‘real-time’ remote biometric identification of natural persons in publicly accessible spaces for the purpose of law enforcement improperly encroaches on the rights and freedoms of the concerned persons and is detrimental to the private life of a large part of the population, in that it makes constant surveillance possible and, in so doing, makes it difficult to exercise the freedom of assembly and other fundamental rights. In addition, the immediacy of the impact and the limited opportunities for further checks or corrections in relation to the use of such systems operating in ‘real-time’ carry heightened risks for the rights and freedoms of the persons that are concerned by law enforcement activities.
Amendment 96 #
Proposal for a regulation
Recital 19
(19) The use of those systems for the purpose of law enforcement in publicly accessible spaces should therefore be prohibited as a matter of principle.
Amendment 97 #
Proposal for a regulation
Recital 20
Amendment 98 #
Proposal for a regulation
Recital 21
Amendment 99 #
Proposal for a regulation
Recital 22
Amendment 100 #
Proposal for a regulation
Recital 23
(23) The use of AI systems for ‘real-time’ remote biometric identification of natural persons in publicly accessible spaces for the purpose of law enforcement necessarily results in the processing of biometric data. The rules of this Regulation that prohibit, subject to certain exceptions, such use, which are based on Article 16 TFEU, should apply as lex specialis in respect of the rules on the processing of biometric data contained in Article 10 of Directive (EU) 2016/680, thus regulating such use and the processing of biometric data involved in an exhaustive manner. Therefore, such use and processing should only be possible in as far as it is compatible with the framework set by this Regulation, without there being scope, outside that framework, for the competent authorities, where they act for purpose of law enforcement, to use such systems and process such data in connection thereto on the grounds listed in Article 10 of Directive (EU) 2016/680. In this context, this Regulation is not intended to provide the legal basis for the processing of personal data under Article 8 of Directive 2016/680. However, the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for purposes other than law enforcement, including by competent authorities, should not be covered by the specific framework regarding such use for the purpose of law enforcement set by this Regulation. Such use for purposes other than law enforcement should therefore not be subject to the requirement of an authorisation under this Regulation and the applicable detailed rules of national law that may give effect to it.
Amendment 102 #
Amendment 105 #
Proposal for a regulation
Recital 27
(27) High-risk AI systems should only be placed on the Union market or put into service if they comply with certain mandatory requirements. Those requirements should ensure that high-risk AI systems available in the Union or whose output is otherwise used in the Union do not pose unacceptable risks to important Union public interests as recognised and protected by Union law. AI systems identified as high-risk should be limited to those that have a significant harmful impact on the health, safety and fundamental rights of persons in the Union and such limitation minimises any potential restriction to international trade, if any.
Amendment 110 #
Proposal for a regulation
Recital 33
(33) Technical inaccuracies of AI systems intended for the remote biometric identification of natural persons in protected spaces can lead to biased results and entail discriminatory effects. This is particularly relevant when it comes to age, ethnicity, sex or disabilities. Therefore, ‘real-time’ and ‘post’ remote biometric identification systems used in the workplace, in higher education or within vocational training should be classified as high-risk. In view of the risks that they pose, both types of remote biometric identification systems should be subject to specific requirements on logging capabilities and human oversight.
Amendment 119 #
Proposal for a regulation
Recital 36
(36) AI systems used in employment, employment support, workers management and access to self-employment, notably for the recruitment and selection of persons, for making decisions on promotion and termination and for task allocation, monitoring or evaluation of persons in work-related contractual relationships, should also be classified as high-risk, since those systems may appreciably impact future career prospects and livelihoods of these persons. Relevant work-related contractual relationships should involve employees and persons providing services through platforms as referred to in the Commission Work Programme 2021. Such persons should in principle not be considered users within the meaning of this Regulation. Throughout the recruitment process and in the evaluation, promotion, or retention of persons in work-related contractual relationships, such systems may perpetuate historical patterns of discrimination, for example against women, certain age groups, persons with disabilities, or persons of certain racial or ethnic origins or sexual orientation. AI systems used to monitor the performance and behaviour of these persons may also impact their rights to data protection and privacy.
Amendment 140 #
Proposal for a regulation
Recital 85
(85) In order to ensure that the regulatory framework can be adapted where necessary, the power to adopt acts in accordance with Article 290 TFEU should be delegated to the Commission to amend the techniques and approaches referred to in Annex I to define AI systems, the Union harmonisation legislation listed in Annex II, the high-risk AI systems listed in Annex III, the provisions regarding technical documentation listed in Annex IV, the content of the EU declaration of conformity in Annex V, the provisions regarding the conformity assessment procedures in Annex VI and VII and the provisions establishing the high-risk AI systems to which the conformity assessment procedure based on assessment of the quality management system and assessment of the technical documentation should apply. It is of particular importance that the Commission carry out appropriate consultations during its preparatory work, including at the level of experts from different areas of society such as education, media and culture and from trade unions and consumer and data protection organisations, and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement of 13 April 2016 on Better Law-Making58. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States’ experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts. _________________ 58 OJ L 123, 12.5.2016, p. 1.
Amendment 160 #
Proposal for a regulation
Article 3 – paragraph 1 – point 39
(39) ‘publicly accessible space’ means any physical or virtual place accessible to the public, regardless of whether certain conditions for access may apply and regardless of form of ownership;
Amendment 163 #
Proposal for a regulation
Article 3 – paragraph 1 – point 44 a (new)
(44a) ‘cultural institutions’ means institutions such as libraries, museums, theatres, concert halls, exhibition centres, architectural ensembles and multi-purpose arts venues, as well as their virtual sections, which organise cultural education, democratic exchanges and research and provide ways and means of engaging with cultural heritage;
Amendment 181 #
Proposal for a regulation
Article 5 – paragraph 1 – point c – introductory part
(c) the placing on the market, putting into service or use of AI systems by public authorities or on their behalf for the evaluation or classification of the trustworthiness of natural persons over a certain period of time based on their social behaviour or known or predicted personal or personality characteristics, with the social score leading to either or both of the following:
Amendment 186 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – introductory part
(d) the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement;
Amendment 188 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point i
Amendment 189 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point ii
Amendment 190 #
Proposal for a regulation
Article 5 – paragraph 1 – point d – point iii
Amendment 194 #
Proposal for a regulation
Article 5 – paragraph 2
Amendment 197 #
Proposal for a regulation
Article 5 – paragraph 3
Amendment 199 #
Proposal for a regulation
Article 5 – paragraph 4
Amendment 267 #
Proposal for a regulation
Annex III – paragraph 1 – point 4 – introductory part
4. Employment and employment support, workers management and access to self-employment: