12 Amendments of Javier ZARZALEJOS related to 2020/2018(INL)
Amendment 2 #
Draft opinion
Paragraph -1 a (new)
-1 a. Stresses that the Digital Services Act should provide a level-playing field by offering sufficient legal clarity regarding the concepts and definitions included in the legislation and by applying to all relevant actors offering digital services in the Union, regardless of whether they are established inside or outside the Union;
Amendment 4 #
Draft opinion
Paragraph 1
1. Underlines that digital services and their underlying algorithms need to fully respect fundamental rights, especially the protection of privacy and personal data, non-discrimination, the rights of the child, and the freedom of speech and information, as enshrined in the Treaties and the Charter of Fundamental Rights of the European Union; calls therefore on the Commission to implement an obligation of transparency, user-friendliness and explanation in layman's terms for consumers of algorithms, and the possibility of human intervention, as well as other measures, such as independent audits and specific stress tests to assist and enforce compliance;
Amendment 11 #
Draft opinion
Paragraph 1 a (new)
1 a. Considers that illegal content online should be treated in the same way as illegal content offline, while fully respecting fundamental rights; points out that illegal content online not only undermines citizens' trust in the digital environment but may also have grave and long-lasting consequences for internal security and fundamental rights, especially those of children; underlines that the swift and consistent detection and removal or blocking of illegal content online continues to be an urgent challenge, as national approaches towards the removal or blocking of illegal content online lack sufficient harmonisation; acknowledges that a differentiation has to be made between the various types of illegal content online, as some content, notably child sexual exploitation material, is manifestly illegal, while the nature of other types of content might depend on the applicable national law or can only be ascertained through contextualisation;
Amendment 14 #
Draft opinion
Paragraph 1 b (new)
1 b. Stresses that exclusive reliance on notice-and-take-down measures and voluntary action by online intermediaries is not sufficient to effectively address illegal content online; believes that the responsibility of online intermediaries to tackle manifestly illegal content on their platforms and the infrastructure they provide should be considerably increased, while taking into account their scale of reach and operational capacities; underlines the importance of complementing this responsibility with effective remedies to be made available to content providers whose content was removed; emphasises, in this regard, that an explicit legal basis for proactive measures to detect and remove or block known illegal content is needed in addition to clear rules on duty of care to ensure compliance with the requirements of Regulation (EU) 2016/679 of the European Parliament and of the Council (GDPR)1c; _________________ 1c Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 16 #
Draft opinion
Paragraph 1 c (new)
1 c. Highlights that illegal content online can easily be multiplied and its negative impact amplified within a very short period of time; recalls that Facebook alone blocked 1.2 million copies of the video of the March 2019 Christchurch attacks at the point of upload and removed another 300,000 copies within 24 hours of the attack, which would not have been possible if each individual removal or blocking decision had been subject to human verification; believes, therefore, that online intermediaries should be expressly allowed to have recourse to automated tools to detect and remove or block access to content whose illegality has either been established by a court or whose illegal nature can be easily determined without contextualisation; stresses, however, that there should be no general monitoring obligation;
Amendment 17 #
Draft opinion
Paragraph 1 d (new)
1 d. Reiterates, moreover, that cooperation between online intermediaries and competent national authorities should be improved to ensure the swift blocking or removal of content flagged by competent authorities and the successful investigation and prosecution of illegal content providers; underlines the importance of harmonising the rules and procedures across the Union in relation to content removals or blockings following notifications by law enforcement, including appropriate time limits for responses to legitimate removal requests and sanctions for systematic failure to respond to such requests;
Amendment 18 #
Draft opinion
Paragraph 1 e (new)
1 e. Stresses the need to ensure that removal or blocking decisions are accurate, well-founded and respect fundamental rights; reiterates that access to judicial redress should be available to content providers to satisfy the right to an effective remedy; highlights, in this regard, that transparency obligations should be imposed on online intermediaries as regards the criteria applied to decide on removals or blockings and the technology used, in order to guarantee the application of necessary safeguards and to avoid discrimination and unnecessary removals or blockings; believes, furthermore, that more transparency is required, both on the side of online platforms and of law enforcement authorities, regarding the types of content removed and the reasons therefor;
Amendment 19 #
Draft opinion
Paragraph 2
2. Emphasises that the rapid development of digital services requires strong and future-proof legislation that protects privacy, provides a reasonable duty of care to ensure digital dignity and effectively addresses illegal content; stresses in this regard that all digital service providers need to fully respect Union data protection law, namely Regulation (EU) 2016/679 of the European Parliament and of the Council (GDPR)1 and Directive 2002/58/EC of the European Parliament and of the Council (ePrivacy)2, currently under revision, and other Union legislation which includes obligations upon them with the aim of balancing the right of users to freedom of expression with the right to liberty and security; _________________ 1 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1). 2 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).
Amendment 45 #
Draft opinion
Paragraph 3
3. Recommends the Commission to work on harmonising the national electronic identification systems with a view to creating a single Union electronic identification system for EU citizens which ensures the protection of personal data and age verification, especially for children, and makes public services more accessible to everyone; considers that such a system should be secure and only process the data that is necessary for the identification of the user;
Amendment 59 #
Draft opinion
Paragraph 5
5. Notes that behavioural advertising, including micro-targeted advertising, and the assessment of individuals may better address potential needs than contextual advertising but can also have negative impacts, especially on minors and other vulnerable groups, by interfering in the private life of individuals, posing questions as to the collection and use of the data used to target said advertising, offering products or services, setting prices, or influencing democratic processes and elections; calls therefore on the Commission to introduce specific requirements with regard to behavioural advertising to protect fundamental rights, including a limitation on micro-targeted advertisements, especially for vulnerable groups, and a prohibition on the use of discriminatory practices for the provision of services or products;
Amendment 69 #
Draft opinion
Paragraph 5 a (new)
5 a. Is concerned about the fragmentation of public oversight and supervision of digital services and the frequent lack of the financial and human resources that oversight bodies need to properly fulfil their tasks; calls for increased cooperation with regard to regulatory oversight of digital services; supports the creation of an independent EU body to ensure harmonised implementation of and compliance with applicable rules;
Amendment 74 #
Draft opinion
Paragraph 5 b (new)
5 b. Highlights the importance of empowering users to enforce their own fundamental rights online; considers that users should be provided with easy access to complaint procedures, legal remedies, educational measures and awareness-raising on data protection issues;