14 Amendments of Rob ROOKEN related to 2020/0361(COD)

Amendment 236 #
Proposal for a regulation
Recital 62 a (new)
(62a) Given that very large online platforms play an essential role in the provision of information to consumers, their general terms and conditions should not restrict fundamental rights, in particular the right to freedom of expression, beyond the limits of the law. Nevertheless, they must be able to give users the option of not viewing certain content, although this decision must always lie with the user and never be promoted by any of the very large online platforms. The latter must ensure at all times that a 'full free speech' mode is available to all adult users, enabling them to view all content that cannot be considered manifestly illegal, in accordance with the laws of the various Member States.
2021/06/10
Committee: LIBE
Amendment 237 #
Proposal for a regulation
Recital 62 b (new)
(62b) It must be possible to hold very large online platforms accountable for improperly removing content or restricting user access, and they should accordingly be liable to pay a minimum amount in damages for doing so. This will make the public more willing to ascertain whether certain content is illegal or protected by freedom of expression, and will also encourage very large online platforms to exercise caution when removing content.
2021/06/10
Committee: LIBE
Amendment 238 #
Proposal for a regulation
Recital 62 c (new)
(62c) Given the imbalance between consumers and the very large online platforms, especially as regards legal expertise and financial resources, it is only fair for Member States to establish a freedom of speech procedure enabling users to submit, entirely digitally, content removed by one of the very large online platforms to a judicial authority in their Member State. The platforms have the technical means to forward deleted content to the judicial authority in question at the touch of a button. The judicial authority must then decide as soon as possible, but within no more than three working days, whether the deleted content is manifestly illegal. If the content is not manifestly illegal, the very large online platform must immediately place the content back online and compensate the user no later than seven working days after the latter has provided all information necessary for settlement of the damages. Within 14 days of the decision of the national judicial authority, the very large online platform must ensure that action is taken to remedy the infringement. It must incorporate the decisions of the national judicial authorities into the algorithms it uses to assess, where necessary, whether content should be deleted.
2021/06/10
Committee: LIBE
Amendment 270 #
Proposal for a regulation
Recital 105
(105) This Regulation respects the fundamental rights recognised by the Charter and the fundamental rights constituting general principles of Union law. Accordingly, this Regulation should be interpreted and applied in accordance with those fundamental rights, especially the freedom of expression and information, as well as the freedom and pluralism of the media. When exercising the powers set out in this Regulation, all public authorities involved should strike a fair balance between conflicting fundamental rights, while attaching particular weight to freedom of expression, a fundamental right that is one of the cornerstones of a democratic society, something that may not be true of other fundamental rights, notwithstanding their importance. Where fundamental rights enter into conflict with each other, account must be taken of the applicable case law, especially that established by the European Court of Human Rights and the Member State courts responsible, in determining which shall prevail in any particular case.
2021/06/10
Committee: LIBE
Amendment 271 #
Proposal for a regulation
Recital 106
(106) Since the objective of this Regulation, namely to ensure the proper functioning of the internal market and a technologically safe, predictable and trusted online environment in which the fundamental rights enshrined in the Charter are duly protected, cannot be sufficiently achieved by the Member States, because they cannot achieve the necessary harmonisation and cooperation by acting alone, but can rather, by reason of its territorial and personal scope, be better achieved at Union level, the Union may adopt measures in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective.
2021/06/10
Committee: LIBE
Amendment 274 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a technologically safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/06/10
Committee: LIBE
Amendment 275 #
Proposal for a regulation
Article 1 – paragraph 2 – subparagraph 1 (new)
(c) ensure that the fundamental rights of each and every individual, in particular the fundamental right to freedom of expression, are effectively protected and can be exercised freely within the limits set by law.
2021/06/10
Committee: LIBE
Amendment 287 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
(ga) 'manifestly illegal content' means content that can be established beyond reasonable doubt as being contrary to the law; content that includes sexual acts with minors or direct incitement to violence is, in every case, manifestly illegal;
2021/06/10
Committee: LIBE
Amendment 561 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
2021/06/10
Committee: LIBE
Amendment 562 #
Proposal for a regulation
Article 20 – paragraph 1 a (new)
1a. Online platforms shall immediately withdraw their services from users who are manifestly distributing illegal content. They shall resume services and compensate users where a judicial authority has been unable to establish the illegal nature of the content in question. They shall resume the provision of services to users able to show that they have not been distributing the illegal content, especially where it emerges that a third party has obtained unauthorised access to a user's account.
2021/06/10
Committee: LIBE
Amendment 586 #
Proposal for a regulation
Article 21 a (new)
Article 21a
Compensation for erroneous removal of content
Online platforms shall compensate users for wrongful termination of services pursuant to Article 20(1) or (2). An online platform shall be liable to pay minimum compensation for termination of service under Article 20(1), amounting to EUR 1 000 where the user is a natural person and EUR 2 500 per day where the user is a business entity. This shall be without prejudice to the right of the user to seek reimbursement of actual damages. An online platform shall be liable to pay minimum compensation of EUR 5 000 for termination of service under Article 20(2).
2021/06/10
Committee: LIBE
Amendment 626 #
Proposal for a regulation
Article 26 – paragraph 1 – point b a (new)
(ba) In carrying out risk assessments, very large online platforms shall pay particular attention to their general terms and conditions, regardless of their form or name, and consider how these relate to the right to freedom of expression, as laid down in Article 11 of the Charter and Article 10 of the ECHR. Very large online platforms shall also closely comply with ECHR case law. They shall ensure that their general terms and conditions do not limit freedom of expression beyond the limits set by law;
2021/06/10
Committee: LIBE
Amendment 630 #
Proposal for a regulation
Article 26 – paragraph 2
2. In carrying out risk assessments, very large online platforms shall pay particular attention to their terms and conditions, regardless of their form or name, and see how they relate to the right to freedom of expression, as laid down in Article 11 of the Charter and Article 10 of the ECHR. They shall also closely comply with ECHR case law. They shall ensure that their terms and conditions do not limit freedom of expression beyond the limits set by law.
2021/06/10
Committee: LIBE
Amendment 634 #
Proposal for a regulation
Article 26 a (new)
Article 26a
Very large online platforms shall not remove content or restrict access to their platform on their own authority unless the user is manifestly distributing illegal content. Removal of content or restriction of access to the platform in the event of systematic or non-systematic distribution of illegal content may be authorised only through a court order.
2021/06/10
Committee: LIBE