
20 Amendments of Ramona STRUGARIU related to 2020/0361(COD)

Amendment 185 #
Proposal for a regulation
Recital 2 a (new)
(2a) Member States also undertake to promote, through multilateral agreements such as the International Partnership for Information and Democracy initiated by Reporters Without Borders and signed by 21 EU Member States, the regulation of the public information and communication space by establishing democratic guarantees for the digital space, based on the responsibility of platforms and guarantees for the reliability of information. These multilateral commitments offer convergent solutions on matters covered by this Regulation.
2021/07/08
Committee: IMCO
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 462 #
Proposal for a regulation
Recital 52 a (new)
(52a) Very large online platforms using recommender systems should be bound by an obligation to promote the reliability of information (due prominence obligation) by implementing mechanisms that refer to a self-regulatory standard and highlight information sources that respect standardised professional and ethical self-regulatory norms; such platforms should in turn give those sources preferential treatment by prioritising their content. A must-carry obligation should ensure that recommender systems display information from trustworthy sources, such as public authorities, scientific sources or public interest journalism, as the first results following search queries in areas of public interest;
2021/07/08
Committee: IMCO
Amendment 464 #
Proposal for a regulation
Recital 52 b (new)
(52b) Providers of public interest journalism should be identified through voluntary, self-regulatory European standards or European standardisation deliverables as defined by Regulation (EU) No 1025/2012 of the European Parliament and of the Council ('technical standards'), transparently developed, governed and enforced, and such standards must be based on internationally accepted best practices and ethical norms;
2021/07/08
Committee: IMCO
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 509 #
Proposal for a regulation
Recital 65 a (new)
(65a) Any change to the recommender systems used by platforms to suggest, rank and prioritise information can have a dramatic impact on users, in particular on media outlets that rely heavily on platforms to be accessible to their audience; consequently, providers of online platforms should be transparent about any changes made to their referencing and recommendation rules, even if made on an experimental basis, and immediately inform the regulators, their users and the authors of referenced content, allowing these changes to be predictable for those affected by them; users should be able to refer the matter to the regulator and ask it to give its opinion on the negative impact of changes to the referencing and recommendation rules, allowing it to require the platform to remedy that impact.
2021/07/08
Committee: IMCO
Amendment 911 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in each Member State where the provider offers its services.
2021/07/08
Committee: IMCO
Amendment 912 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in each Member State where the provider offers its services.
2021/07/08
Committee: IMCO
Amendment 1517 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Recommender systems - prominence of public journalism
1. Online platforms shall ensure due prominence of public interest journalism on their services. Services that cater to special interests may be exempted from this obligation. Appropriate prominence measures should include the use of technical standards established in a participatory and transparent manner in order to identify media outlets and entities operating according to the highest, internationally recognised professional norms to produce reliable and accurate information.
2. Providers of public interest journalism shall be identified through voluntary, self-regulatory European standards or European standardisation deliverables as defined by Regulation (EU) No 1025/2012 (‘technical standards’), which are transparently developed, governed and enforced. Any of those standards shall be based on internationally accepted best practices and ethical norms to serve as legitimate criteria to implement the due prominence obligation. The application of these technical standards must be attributed and disclosed by and to all parties involved.
3. Appropriate measures as per this provision shall not discriminate on the basis of content or viewpoint. Intermediaries shall not treat non-compliance with or non-usage of such technical standards as a reason to exclude, down-rank, demote or otherwise negatively affect the visibility or monetisation of content. In order to demonstrate compliance with their duty to ensure due prominence for public interest journalism on their services, online intermediaries shall establish mandatory transparent mechanisms and metrics of indexation regarding discoverability and visibility in search ranks, news feeds and products, including the provision of data and information on prioritisation, personalisation and recommendation algorithms, audits and complaints, in an accountable manner.
4. A Digital Services Coordinator shall monitor and assess whether the appropriate measures adopted by online intermediaries under this Article are sufficient to contribute to media pluralism and diversity in their respective national markets. To this end, the Digital Services Coordinator should rely on self-regulatory and co-regulatory mechanisms.
5. Recipients of services shall always have a clear and easily accessible choice to opt out of the appropriate measures designed to ensure due prominence to public interest journalism.
2021/07/08
Committee: IMCO
Amendment 1522 #
Proposal for a regulation
Article 24 b (new)
Article 24b
Transparency on algorithm modifications
1. Providers of online platforms shall be transparent about changes in their referencing and recommendation rules, even if made on an experimental basis, and shall immediately inform the regulators, their users and the authors of referenced content, allowing these changes to be foreseen by those affected by them.
2. Users may refer the matter to the regulator to ask it to give its opinion on the negative impact of changes to the referencing and recommendation rules, so that it can require the platform to remedy this impact.
2021/07/08
Committee: IMCO
Amendment 1525 #
Proposal for a regulation
Article 24 c (new)
Article 24c
Neutrality
Very large online platforms are subject to an obligation of political, ideological or religious neutrality, and may not promote political parties, opinions or ideas.
2021/07/08
Committee: IMCO
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation describing the reasons why those measures were not put in place; that explanation shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
2021/07/08
Committee: IMCO
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 1815 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
1. The Commission shall support and promote the development and implementation of voluntary industry standards or standardisation deliverables set by relevant European and international standardisation bodies at least for the following:
2021/07/08
Committee: IMCO
Amendment 1828 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) self-regulatory, certifiable and machine-readable criteria for the transparency of ownership and professionalism of editorial processes to identify reliable sources of information pursuant to Article 24 a;
2021/07/08
Committee: IMCO
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
When the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO