Activities of Leszek MILLER related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (11)
Amendment 701
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘editorial platform’ means an intermediary service which is in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790 or another editorial media service and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content and which is under the supervision of the editorial team of the publication or other editorial media.
Amendment 910
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. Very large online platforms shall designate a legal representative in each of the Member States where the provider offers its services.
Amendment 947
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Where very large online platforms within the meaning of Article 25 of this Regulation otherwise allow for the dissemination to the public of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, such platforms shall not remove, disable access to, suspend or otherwise interfere with such content or the related service or suspend or terminate the related account on the basis of the alleged incompatibility of such content with its terms and conditions.
Amendment 970
Proposal for a regulation
Article 12 a (new)
Article 12a
Exclusions
Articles 12 and 13 of Section 1, the provisions of Section 2, and Section 3 of Chapter III shall not apply to:
(a) editorial platforms within the meaning of Article 2(h) of this Regulation;
(b) online platforms that qualify as micro and medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC;
(c) an intermediary service, except very large online platforms, where it would constitute a disproportionate burden in view of its size, the nature of its activity and the risk posed to users.
Amendment 1569
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 1579
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 1586
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 1650
Proposal for a regulation
Article 27 a (new)
Article 27a
Mitigation of risks for the freedom of expression and freedom and pluralism of the media
1. Where specific systemic risks for the exercise of freedom of expression and freedom and pluralism of the media pursuant to Article 26(1)(b) emerge, very large online platforms shall ensure that the exercise of these fundamental rights is always adequately and effectively protected.
2. Where very large online platforms allow for the dissemination of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, of audiovisual media services within the meaning of Article 1(1)(a) of Directive 2010/13/EU (AVMS) or of other editorial media, which are published in compliance with applicable Union and national law under the editorial responsibility and control of a press publisher, audiovisual or other media service provider, who can be held liable under the laws of a Member State, the platforms shall be prohibited from removing, disabling access to, suspending or otherwise interfering with such content or services or suspending or terminating the service providers’ accounts on the basis of the alleged incompatibility of such content with their terms and conditions.
3. Very large online platforms shall ensure that their content moderation, their decision-making processes, the features or functioning of their services, their terms and conditions and recommender systems are objective, fair and non-discriminatory.
Amendment 1697
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters used in recommender systems shall always be fair and non-discriminatory.
Amendment 1854
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1)(a) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1865
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives in relation to the dissemination of illegal content, and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.