
37 Amendments of Lídia PEREIRA related to 2020/0361(COD)

Amendment 140 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation and the competitiveness of European companies should not be hampered but instead be stimulated.
2021/09/10
Committee: ECON
Amendment 161 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers in connection with information relating to illegal content, products, services and activities. The illegal nature of such content, products or services is defined by relevant Union law or national law in accordance with Union law. The concept should be understood, for example, to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/09/10
Committee: ECON
Amendment 174 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation, without prejudice to Article 6, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/09/10
Committee: ECON
Amendment 216 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. In this regard, the Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises. Furthermore, the Commission and Digital Services Coordinators are also encouraged to do so for medium-sized enterprises, which, while not benefitting from the exemption from the obligations in Section 3, may sometimes lack the legal resources necessary to ensure proper understanding of and compliance with all provisions. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/09/10
Committee: ECON
Amendment 220 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Such entities can also include businesses who have a vested interest in flagging counterfeit products of their brand, thus ensuring the online consumer experience is safer and more reliable. Similarly, for intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
2021/09/10
Committee: ECON
Amendment 253 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. Such reinforcement could include the expansion of, and allocation of resources to, content moderation in languages other than English. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/09/10
Committee: ECON
Amendment 258 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with its obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform systematically does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
2021/09/10
Committee: ECON
Amendment 347 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures for the implementation of community rules and guidelines of their services, or to comply with the requirements of Union law, including those set out in this Regulation, or national law in accordance with Union law.
2021/09/10
Committee: ECON
Amendment 354 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services;
2021/09/10
Committee: ECON
Amendment 356 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1 (new)
In extraordinary cases, where the intermediary service has reasonable doubts as to whether the removal order is legally sound, it should have access to a mechanism to challenge the decision. This mechanism shall be established by the Digital Services Coordinators in coordination with the Board and the Commission.
2021/09/10
Committee: ECON
Amendment 359 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services.
2021/09/10
Committee: ECON
Amendment 409 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 (new)
Where a provider of hosting services decides not to remove or disable access to specific items of information provided by the recipients of the service, detected through the mechanisms established in Article 14, it shall inform the user who notified the online platform of the content and, where needed, the recipient of the decision without undue delay. The notification of such a decision can be done through automated means.
2021/09/10
Committee: ECON
Amendment 415 #
Proposal for a regulation
Article 16 – paragraph 1 – subparagraph 1 (new)
The Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/09/10
Committee: ECON
Amendment 420 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable or restrict access to the information;
2021/09/10
Committee: ECON
Amendment 421 #
Proposal for a regulation
Article 17 – paragraph 1 – subparagraph 1 (new)
Complaints can also be lodged against decisions made by the online platform not to remove, not to disable, not to suspend and not to terminate access to accounts.
2021/09/10
Committee: ECON
Amendment 426 #
Proposal for a regulation
Article 17 – paragraph 3 – point a (new)
(a) Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is indeed illegal and is incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does warrant the suspension or termination of the service or the account, it shall also reverse its decision referred to in paragraph 1 without undue delay.
2021/09/10
Committee: ECON
Amendment 438 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Commission or by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:
2021/09/10
Committee: ECON
Amendment 439 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, except in the case of businesses with a vested interest in flagging counterfeit products of their brand, thus ensuring the online consumer experience is safer and more reliable;
2021/09/10
Committee: ECON
Amendment 443 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators and the Commission shall communicate to each other and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2.
2021/09/10
Committee: ECON
Amendment 445 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the authority that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/09/10
Committee: ECON
Amendment 447 #
Proposal for a regulation
Article 19 – paragraph 6
6. The authority that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/09/10
Committee: ECON
Amendment 459 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or its legal representative and also transmit the information to Europol for appropriate follow-up.
2021/09/10
Committee: ECON
Amendment 468 #
Proposal for a regulation
Article 22 – paragraph 1 – subparagraph 1 (new)
Online platforms that facilitate the sale of harmonised consumer goods between a seller in a third country and a consumer in the EU, where there is no other manufacturer or importer in the EU, should verify that the product bears the required conformity marking (CE mark) and that it is accompanied by the other relevant documents (e.g. the EU declaration of conformity). Traders from within the Union and from third countries should also have the option to voluntarily upload the relevant documents certifying that their goods meet the consumer protection standards of the EU. If the traders choose to do so, online platforms may then show proof of these documents to users as part of the user interface to instil more consumer confidence in the distance contracts concluded on their platforms.
2021/09/10
Committee: ECON
Amendment 470 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. Provided that the online platform has made reasonable efforts to assess the information in points (a), (d) and (e), the online platform shall not be held liable for information provided by the trader that turns out to be inaccurate.
2021/09/10
Committee: ECON
Amendment 471 #
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
3. Where the online platform obtains indications, through its reasonable efforts under paragraph 2 or through Member States' consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/09/10
Committee: ECON
Amendment 490 #
Proposal for a regulation
Article 25 – paragraph 2
2. The Commission should be able to update this Regulation through legislative acts in accordance with Article 294 TFEU. Such revisions may be necessary to adjust the number of average monthly recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases by at least 5 % in relation to its population in 2020 or, after adjustment by means of a legislative act, of its population in the year in which the latest legislative act was adopted. In that case, it shall adjust the number so that it corresponds to 10 % of the Union’s population in the year in which it adopts the legislative act, rounded up or down to allow the number to be expressed in millions.
2021/09/10
Committee: ECON
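To make the arithmetic in paragraphs 1 and 2 concrete, a minimal worked example follows. It assumes, purely for illustration, a Union population of roughly 447 million in 2020 (the Eurostat figure); the binding numbers are those fixed by the Regulation and by any act subsequently adjusting them.

% Illustrative only: assumes an EU population of roughly 447 million in 2020.
\[
\text{threshold} \approx 0.10 \times 447\ \text{million} = 44.7\ \text{million} \approx 45\ \text{million average monthly recipients (rounded to the nearest million)}
\]
% Under the amended paragraph 2, this figure would be revisited by legislative act only once the
% Union population had increased or decreased by at least 5 % relative to the 2020 baseline,
% i.e. moved outside the range of roughly 425 to 469 million.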
Amendment 515 #
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 (new)
(c) measures taken by the Digital Services Coordinators, the Board and the Commission to ensure that highly sensitive information and business secrets are kept confidential.
2021/09/10
Committee: ECON
Amendment 521 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
2021/09/10
Committee: ECON
Amendment 524 #
Proposal for a regulation
Article 28 – paragraph 2 – subparagraph 1 (new)
(d) have not provided an audit to the same very large online platform for more than three consecutive years.
2021/09/10
Committee: ECON
Amendment 525 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
(f) where the audit opinion is negative, operational recommendations on specific measures to achieve compliance and risk-based remediation timelines, with a focus on rectifying as a priority those issues that have the potential to cause most harm to users of the service;
2021/09/10
Committee: ECON
Amendment 526 #
Proposal for a regulation
Article 28 – paragraph 3 – subparagraph 1 (new)
(g) where the organisations that perform the audits do not have enough information to conclude on an opinion due to the novelty of the issues audited, a disclaimer of an opinion shall be given.
2021/09/10
Committee: ECON
Amendment 543 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. The delegated acts should also lay out the technical conditions needed to ensure confidentiality and security of information by the vetted researchers once they acquire access to the data, including guidelines for academics who wish to publish findings based on the confidential data acquired. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/09/10
Committee: ECON
Amendment 560 #
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 (new)
Member States shall designate the status of Digital Services Coordinator based on the following criteria:
(a) the authority has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
(b) it represents collective interests and is independent from any online platform;
(c) it has the capacity to carry out its activities in a timely, diligent and objective manner.
2021/09/10
Committee: ECON
Amendment 574 #
Proposal for a regulation
Article 44 – paragraph 1
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board.
2021/09/10
Committee: ECON
Amendment 575 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
(ba) measures taken by the Digital Services Coordinators to ensure that highly sensitive information and business secrets are kept confidential;
2021/09/10
Committee: ECON
Amendment 576 #
Proposal for a regulation
Article 44 – paragraph 2 – point b b (new)
(bb) an assessment of the interpretation of the Country of Origin principle in the supervisory and enforcement activities of the Digital Services Coordinators, especially as regards Article 45 of this Regulation.
2021/09/10
Committee: ECON
Amendment 607 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms, without prejudice to Directive (EU) 2016/943 on trade secrets.
2021/09/10
Committee: ECON