
13 Amendments of Katarina BARLEY related to 2020/2016(INI)

Amendment 31 #
Motion for a resolution
Recital B a (new)
B a. whereas the use of AI technology should be developed in such a way as to put people at its centre and therefore to be worthy of public trust;
2020/07/20
Committee: LIBE
Amendment 36 #
Motion for a resolution
Recital C a (new)
C a. whereas AI systems always have to be in the service of humans and have the ultimate safety valve of being designed so that they can always be shut down by a human operator;
2020/07/20
Committee: LIBE
Amendment 43 #
Motion for a resolution
Recital E
E. whereas AI applications may offer great opportunities in the field of law enforcement, in particular in improving the working methods of law enforcement agencies and judicial authorities, and combating certain types of crime more efficiently, in particular financial crime, money laundering and terrorist financing, as well as certain types of cybercrime, while at the same time entailing significant risks for the fundamental rights of people;
2020/07/20
Committee: LIBE
Amendment 52 #
Motion for a resolution
Recital F b (new)
F b. whereas the relationship between protecting fundamental rights and effective policing must always be an essential element in the discussions on whether and how AI should be used by the law enforcement sector, where decisions may have long-lasting consequences on the life and freedom of individuals;
2020/07/20
Committee: LIBE
Amendment 58 #
Motion for a resolution
Recital G
G. whereas AI applications in use by law enforcement include applications such as facial recognition technologies, automated number plate recognition, speaker identification, speech identification, lip-reading technologies, aural surveillance (i.e. gunshot detection algorithms), autonomous research and analysis of identified databases, forecasting (predictive policing and crime hotspot analytics), behaviour detection tools, autonomous tools to identify financial fraud and terrorist financing, social media monitoring (scraping and data harvesting for mining connections), international mobile subscriber identity (IMSI) catchers, and automated surveillance systems incorporating different detection capabilities (such as heartbeat detection and thermal cameras); whereas the aforementioned applications have vastly varying degrees of reliability and accuracy as well as potentially significant effects on the protection of fundamental rights;
2020/07/20
Committee: LIBE
Amendment 69 #
Motion for a resolution
Recital I
I. whereas use of AI in law enforcement entails a number of high risks for the protection of fundamental rights of individuals, such as opaque decision-making, different types of discrimination, and risks to the protection of privacy and personal data, the protection of freedom of expression and information, and the presumption of innocence;
2020/07/20
Committee: LIBE
Amendment 102 #
Motion for a resolution
Paragraph 3
3. Considers, in this regard, that any AI tool either developed or used by law enforcement or judiciary should, as a minimum, be safe, secure and fit for purpose, respect the principles of data minimisation, fairness, accountability, transparency and explainability, with their development, deployment and use subject to a strict necessity and proportionality test;
2020/07/20
Committee: LIBE
Amendment 115 #
Motion for a resolution
Paragraph 4
4. Sees with great concern the potential of mass surveillance by means of AI technologies in the law enforcement sector; Highlights the imperative need of preventing such mass surveillance by means of AI technologies, and of banning any applications that would result in it;
2020/07/20
Committee: LIBE
Amendment 143 #
Motion for a resolution
Paragraph 8 a (new)
8 a. Stresses that only robust European AI governance can enable the necessary operationalisation of fundamental rights principles;
2020/07/20
Committee: LIBE
Amendment 153 #
Motion for a resolution
Paragraph 10
10. Underlines that in judicial and law enforcement contexts, the final decision always needs to be taken by a human, who can be held accountable for the decisions made, and that there is the possibility of recourse to a remedy; reminds that under EU law, automated individual decision-making shall not be based on special categories of personal data (personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation), unless suitable measures to safeguard the data subject's rights and freedoms and legitimate interests are in place; highlights that EU law prohibits profiling that results in discrimination against natural persons on the basis of special categories of personal data;
2020/07/20
Committee: LIBE
Amendment 168 #
Motion for a resolution
Paragraph 12 a (new)
12 a. Calls for clear and appropriate time limits to be established for the erasure of personal data or for a periodic review of the need for the storage of personal data processed or generated by AI technologies for law enforcement purposes;
2020/07/20
Committee: LIBE
Amendment 170 #
Motion for a resolution
Paragraph 13
13. Reminds that EU law (Directive (EU) 2016/680) already foresees a mandatory data protection impact assessment for any type of processing, in particular using new technologies, that is likely to result in a high risk to the rights and freedoms of natural persons, and is of the opinion that this is the case for all AI technologies in the area of law enforcement; calls in addition for a compulsory fundamental rights impact assessment to be conducted prior to the implementation or deployment of any AI systems for law enforcement or judiciary, in order to assess any potential risks to fundamental rights;
2020/07/20
Committee: LIBE
Amendment 185 #
Motion for a resolution
Paragraph 15
15. Calls for a moratorium on the deployment of facial recognition systems for specific law enforcement operations, until the technical standards can be considered fully fundamental rights compliant, results derived are non-discriminatory, and there is public trust in the necessity and proportionality for the deployment of such technologies; calls for a ban on the use of facial recognition in the public sphere where not used in specific law enforcement operations;
2020/07/20
Committee: LIBE