32 Amendments of Javier ZARZALEJOS related to 2020/2019(INL)
Amendment 5 #
Motion for a resolution
Citation 7 b (new)
- having regard to the Commission Recommendation of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177),
Amendment 9 #
Motion for a resolution
Citation 7 f (new)
- having regard to Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism,
Amendment 22 #
Motion for a resolution
Recital B a (new)
Ba. whereas digital services are used by the majority of Europeans on a daily basis but are subject to an increasingly wide set of rules across the EU, leading to significant fragmentation of the market and, consequently, to legal uncertainty for European users and for services operating cross-border, combined with a lack of regulatory control over key aspects of today's information environment;
Amendment 54 #
Motion for a resolution
Recital H
Amendment 60 #
Motion for a resolution
Recital H a (new)
Ha. whereas automated content removal mechanisms of digital service providers should be proportionate, covering only justified cases in which the benefits of removing content outweigh the potential disadvantages of keeping it online; whereas these procedures should also be transparent, and their terms and conditions should be made known to users before they use the service;
Amendment 93 #
Motion for a resolution
Paragraph 2 a (new)
2a. Proposes that the Digital Services Act follow a sector- and problem-specific approach and make a clear distinction between illegal and harmful content when elaborating the appropriate policy options;
Amendment 98 #
Motion for a resolution
Paragraph 2 d (new)
2d. Proposes that the Digital Services Act set an obligation for digital service providers without a permanent establishment in the EU to designate a legal representative for the interests of users within the European Union and to make the contact information of that representative visible and accessible on their websites;
Amendment 99 #
Motion for a resolution
Paragraph 2 e (new)
2e. Underlines that online platforms hosting or moderating content online should bear greater responsibility for the content they host and should act proactively to prevent illegality;
Amendment 105 #
Motion for a resolution
Paragraph 3
3. Considers that, following the actions of digital service providers, any final decision on the legality of user-generated content must be made by an independent judiciary and not by a private commercial entity;
Amendment 110 #
Motion for a resolution
Paragraph 4
4. Insists that the regulation must proscribe content moderation practices that are disproportionate or unduly go beyond the purpose of protection under the law;
Amendment 118 #
Motion for a resolution
Paragraph 5
5. Recommends the establishment of a network of national authorities tasked with monitoring the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 129 #
Motion for a resolution
Paragraph 6
6. Suggests that digital service providers regularly submit transparency reports to the network of national authorities and the European Commission concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that they publish statistics and data related to automated content filtering, as well as their decisions on removing user-generated content, in a publicly accessible database;
Amendment 138 #
Motion for a resolution
Paragraph 7
7. Considers the establishment of independent dispute settlement bodies in the Member States, tasked with settling disputes regarding content moderation;
Amendment 149 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing digital service providers to employ an automated filtering mechanism that goes beyond the level of protection required by the law; encourages digital service providers, however, to employ such a mechanism in order to combat illegal content online;
Amendment 242 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 1 b (new)
- It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options.
Amendment 244 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 2
- It should provide principles for content moderation, including as regards discriminatory content moderation practices.
Amendment 254 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 4
- It should provide rules for an independent dispute settlement mechanism while respecting the national competences of the Member States.
Amendment 264 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
A network of national authorities should be established with the following main tasks:
Amendment 269 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1 a (new)
- regularly monitoring the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 275 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 3 a (new)
- cooperating and coordinating with the national authorities of the Member States on the implementation of the Digital Services Act.
Amendment 309 #
Motion for a resolution
Annex I – part A – part I – section 3 – introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the Commission and the network of national authorities. Such reports should, in particular, include:
Amendment 377 #
Motion for a resolution
Annex I – part B – recital 9
Amendment 383 #
Motion for a resolution
Annex I – part B – recital 9 a (new)
(9a) This Regulation does not prevent platforms from using an automated content moderation mechanism where necessary and justified, and in particular promotes the use of such a mechanism where the illegal nature of the content has either been established by a court or can easily be determined without contextualisation.
Amendment 384 #
Motion for a resolution
Annex I – part B – recital 10
(10) This Regulation should also include provisions against unjustified content moderation practices.
Amendment 402 #
Motion for a resolution
Annex I – part B – recital 21
Amendment 417 #
Motion for a resolution
Annex I – part B – Article 3 – point 2
(2) ‘illegal content’ means any information which is not in compliance with Union law or the law of a Member State concerned;
Amendment 428 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2
Amendment 434 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Amendment 437 #
Motion for a resolution
Annex I – part B – Article 4 b (new)
Article 4b
Transparency obligation
1. Digital service providers actively hosting or moderating online content shall take the necessary measures to disclose the funding and the influence of the interest groups behind those using their services, so that the person legally responsible and accountable can be identified.
2. Digital service providers without a permanent establishment in the EU shall designate a legal representative for the interests of users within the European Union and make the contact information of that representative visible and accessible on their websites.
Amendment 470 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1
Amendment 474 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1 a (new)
Digital service providers should act expeditiously to remove or make unavailable illegal content that has been notified to them, and should make best efforts to prevent future uploads of the same content.
Amendment 485 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3 a (new)
3a. Both the place where the content has been uploaded and the place where it has been accessed shall be deemed to constitute grounds of jurisdiction.