32 Amendments of Javier ZARZALEJOS related to 2020/2019(INL)

Amendment 5 #
Motion for a resolution
Citation 7 b (new)
- having regard to the Recommendation of the Commission of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177),
2020/06/05
Committee: JURI
Amendment 9 #
Motion for a resolution
Citation 7 f (new)
- having regard to Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism,
2020/06/05
Committee: JURI
Amendment 22 #
Motion for a resolution
Recital B a (new)
Ba. whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the EU, leading to significant fragmentation of the market and, consequently, to legal uncertainty for European users and for services operating cross-border, combined with a lack of regulatory control over key aspects of today's information environment;
2020/06/05
Committee: JURI
Amendment 54 #
Motion for a resolution
Recital H
H. whereas content hosting platforms often employ automated content removal mechanisms that raise legitimate rule of law concerns, in particular when they are encouraged to employ such mechanisms pro-actively and voluntarily, resulting in content removal taking place without a clear legal basis, which is in contravention of Article 10 of the European Convention on Human Rights, stating that formalities, conditions, restrictions or penalties governing the exercise of freedom of expression and information must be prescribed by law;
deleted
2020/06/05
Committee: JURI
Amendment 60 #
Motion for a resolution
Recital H a (new)
Ha. whereas the automated content removal mechanisms of digital service providers should be proportionate, covering only justified cases in which the benefits of removing content outweigh the potential disadvantages of keeping it online; whereas these procedures should also be transparent and their terms and conditions should be made known to users before they use the service;
2020/06/05
Committee: JURI
Amendment 93 #
Motion for a resolution
Paragraph 2 a (new)
2a. Proposes that the Digital Services Act follow a sector- and problem-specific approach and make a clear distinction between illegal and harmful content when elaborating the appropriate policy options;
2020/06/05
Committee: JURI
Amendment 98 #
Motion for a resolution
Paragraph 2 d (new)
2d. Proposes that the Digital Services Act oblige digital service providers without a permanent establishment in the EU to designate a legal representative for the interests of users within the European Union and to make the contact information of that representative visible and accessible on their websites;
2020/06/05
Committee: JURI
Amendment 99 #
Motion for a resolution
Paragraph 2 e (new)
2e. Underlines that online platforms hosting or moderating content should bear more responsibility for the content they host and should act proactively to prevent illegality;
2020/06/05
Committee: JURI
Amendment 105 #
Motion for a resolution
Paragraph 3
3. Considers that, following the actions of digital service providers, any final decision on the legality of user-generated content must be made by an independent judiciary and not by a private commercial entity;
2020/06/05
Committee: JURI
Amendment 110 #
Motion for a resolution
Paragraph 4
4. Insists that the regulation must proscribe content moderation practices that are not proportionate or unduly go beyond the purpose of protection under the law;
2020/06/05
Committee: JURI
Amendment 118 #
Motion for a resolution
Paragraph 5
5. Recommends the establishment of a European network of national authorities tasked with monitoring the practice of automated content filtering and curation, and reporting to the EU institutions;
2020/06/05
Committee: JURI
Amendment 129 #
Motion for a resolution
Paragraph 6
6. Suggests that digital service providers regularly submit transparency reports to the network of national authorities and the European Commission concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish statistics and data related to automated content filtering, as well as their decisions on removing user-generated content, in a publicly accessible database;
2020/06/05
Committee: JURI
Amendment 138 #
Motion for a resolution
Paragraph 7
7. Considers the establishment of independent dispute settlement bodies in the Member States, tasked with settling disputes regarding content moderation;
2020/06/05
Committee: JURI
Amendment 149 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing digital service providers to employ automated filtering mechanisms that go beyond the level of protection required by law; encourages digital service providers, however, to employ such mechanisms in order to combat illegal content online;
2020/06/05
Committee: JURI
Amendment 242 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 b (new)
- It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options.
2020/06/05
Committee: JURI
Amendment 244 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 2
- It should provide principles for content moderation, including as regards discriminatory content moderation practices.
2020/06/05
Committee: JURI
Amendment 254 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 4
- It should provide rules for an independent dispute settlement mechanism while respecting the national competences of the Member States.
2020/06/05
Committee: JURI
Amendment 264 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
A European network of national authorities should be established with the following main tasks:
2020/06/05
Committee: JURI
Amendment 269 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1 a (new)
- regularly monitoring the practice of automated content filtering and curation, and reporting to the EU institutions;
2020/06/05
Committee: JURI
Amendment 275 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 3 a (new)
- cooperating and coordinating with the national authorities of the Member States on the implementation of the Digital Services Act.
2020/06/05
Committee: JURI
Amendment 309 #
Motion for a resolution
Annex I – part A – part I – section 3 –– introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the Commission and the network of national authorities. Such reports should, in particular, include:
2020/06/05
Committee: JURI
Amendment 377 #
Motion for a resolution
Annex I – part B – recital 9
(9) This Regulation should not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante control of content.
deleted
2020/06/05
Committee: JURI
Amendment 383 #
Motion for a resolution
Annex I – part B – recital 9 a (new)
(9a) This Regulation does not prevent platforms from using an automated content moderation mechanism where necessary and justified, and in particular promotes the use of such a mechanism where the illegal nature of the content has either been established by a court or can easily be determined without contextualisation.
2020/06/05
Committee: JURI
Amendment 384 #
Motion for a resolution
Annex I – part B – recital 10
(10) This Regulation should also include provisions against unjustified content moderation practices.
2020/06/05
Committee: JURI
Amendment 402 #
Motion for a resolution
Annex I – part B – recital 21
(21) Action at Union level as set out in this Regulation would be substantially enhanced with the establishment of a Union agency tasked with monitoring and ensuring compliance by content hosting platforms with the provisions of this Regulation. The Agency should review compliance with the standards laid down for content management on the basis of transparency reports and an audit of algorithms employed by content hosting platforms for the purpose of content management.
deleted
2020/06/05
Committee: JURI
Amendment 417 #
Motion for a resolution
Annex I – part B – Article 3 –point 2
(2) 'illegal content' means any information which is not in compliance with Union law or the law of a Member State concerned;
2020/06/05
Committee: JURI
Amendment 428 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2
2. Users shall not be subjected to discriminatory content moderation practices by the content hosting platforms, such as removal of user-generated content based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.
deleted
2020/06/05
Committee: JURI
Amendment 434 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Article 4a
Voluntary action
1. Without prejudice to Articles 12-14 of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), a content hosting platform is not liable for any of the information that it stores, indexes, makes available or transmits simply by virtue of the fact that it has taken voluntary action in good faith, whether of an automated or a non-automated nature, to identify, remove, disable access to, or otherwise restrict information or activity that the service provider reasonably considers to be illegal or otherwise objectionable.
2. Where a content hosting platform takes voluntary action in accordance with paragraph 1:
(a) it shall not be taken to imply that, as a result of the voluntary action, the content hosting platform has knowledge of or control over the information which it transmits or stores;
(b) nor shall it be taken to imply that, as a result of the voluntary action, the activity of the content hosting platform is not of a mere technical, automatic and passive nature.
3. Paragraphs 1 and 2 shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the content hosting platform to terminate or prevent an infringement.
2020/06/05
Committee: JURI
Amendment 437 #
Motion for a resolution
Annex I – part B – Article 4 b (new)
Article 4b
Transparency obligations
1. Digital services actively hosting or moderating online content shall take the necessary measures to disclose the funding and the power of interest groups behind those using their services, so that the person legally responsible and accountable is identifiable.
2. Digital service providers without a permanent establishment in the EU shall designate a legal representative for the interests of users within the European Union and shall make the contact information of that representative visible and accessible on their websites.
2020/06/05
Committee: JURI
Amendment 470 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1
Without prejudice to judicial or administrative orders regarding content online, content that has been the subject of a notice shall remain visible until a final decision has been taken regarding its removal or takedown.
deleted
2020/06/05
Committee: JURI
Amendment 474 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1 a (new)
Digital service providers should act expeditiously to remove or make unavailable illegal content that has been notified to them, and should make best efforts to prevent future uploads of the same content.
2020/06/05
Committee: JURI
Amendment 485 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3 a (new)
3a. Both the place where the content has been uploaded and the place where it has been accessed shall be deemed to constitute grounds of jurisdiction.
2020/06/05
Committee: JURI