
Activities of Gilles LEBRETON related to 2020/0361(COD)

Legal basis opinions (0)

Amendments (94)

Amendment 122 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from the criminal, administrative or civil legal framework of a Member State and what the precise nature or subject matter is of the law in question.
2021/07/20
Committee: JURI
Amendment 159 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of national and Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/20
Committee: JURI
Amendment 169 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services that do not qualify as very large online platforms should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation and with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
2021/07/20
Committee: JURI
Amendment 189 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/07/20
Committee: JURI
Amendment 216 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact are such that they meet the criteria to qualify as very large online platforms under this Regulation or are held or controlled by entities established outside the European Union. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/20
Committee: JURI
Amendment 222 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
deleted
2021/07/20
Committee: JURI
Amendment 263 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 269 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and relevant public actors.
2021/07/20
Committee: JURI
Amendment 272 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment(s) and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
2021/07/19
Committee: JURI
Amendment 287 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, a Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/19
Committee: JURI
Amendment 294 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct shall be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/07/19
Committee: JURI
Amendment 307 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission in cooperation with the Board may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
2021/07/19
Committee: JURI
Amendment 315 #
Proposal for a regulation
Recital 76 a (new)
(76 a) With regard to very large online platforms that offer services in the Union and therefore fall within the scope of this Regulation, Member States where individuals or representative organisations received their services shall have jurisdiction, without prejudice to the relevant consumer protection jurisdiction under national and European law.
2021/07/19
Committee: JURI
Amendment 318 #
Proposal for a regulation
Recital 81
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. Where the relevant case involves a very large online platform, the national Digital Services Coordinator that received the complaint should be able to act and take the adequate measures under this Regulation.
2021/07/19
Committee: JURI
Amendment 320 #
Proposal for a regulation
Recital 85
(85) Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than three Member States, should be able to refer the matter to the Commission in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. The Commission, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Commission’s intervention under Section 3 of Chapter IV of this Regulation, where the suspected infringer is a very large online platform.
2021/07/19
Committee: JURI
Amendment 330 #
Proposal for a regulation
Recital 95
(95) In order to address those public policy concerns it is therefore necessary to provide for a common system of enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the interested Digital Services Coordinator of establishment, upon its own initiative or upon the Board's advice, should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. That Digital Services Coordinator should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Digital Services Coordinators should perform their role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board.
2021/07/19
Committee: JURI
Amendment 332 #
Proposal for a regulation
Recital 96
(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission may, on its own initiative or upon advice of the Board, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the relevant Digital Services Coordinator of establishment(s) did not take any measures despite the Commission’s request, or in situations where a Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform.
2021/07/19
Committee: JURI
Amendment 334 #
Proposal for a regulation
Recital 97
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
2021/07/19
Committee: JURI
Amendment 338 #
Proposal for a regulation
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission and the Member States should have the necessary investigative and enforcement powers to allow them to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principles of proportionality and subsidiarity and the rights and interests of the affected parties.
2021/07/19
Committee: JURI
Amendment 340 #
Proposal for a regulation
Recital 99
(99) In particular, the Commission should have access to any relevant documents, data and information necessary, when acting pursuant to the powers granted under this Regulation, to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
2021/07/19
Committee: JURI
Amendment 384 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information or activity which, including the sale of products or provision of services, is not in compliance with the criminal, administrative or civil legal framework of a Member State;
2021/07/19
Committee: JURI
Amendment 418 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content if the content or activity is to be deemed illegal under Article 2(g).
2021/07/19
Committee: JURI
Amendment 434 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union and national law, including those set out in this Regulation.
2021/07/19
Committee: JURI
Amendment 577 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC that are not controlled or owned by entities having their establishment outside the European Union.
2021/07/19
Committee: JURI
Amendment 603 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
deleted
2021/07/19
Committee: JURI
Amendment 632 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access and the duration;
2021/07/19
Committee: JURI
Amendment 645 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and that are not controlled or owned by entities having their establishment outside the European Union.
2021/07/19
Committee: JURI
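Read together with Amendment 577 above, this amendment turns the micro/small-enterprise carve-out into a two-part test: SME status under the Annex to Recommendation 2003/361/EC plus EU ownership. A minimal sketch of that test follows, assuming the small-enterprise ceilings of the Recommendation (fewer than 50 staff, annual turnover or balance-sheet total of at most EUR 10 million); all identifiers are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Provider:
    staff_headcount: int
    annual_turnover_eur: float  # or balance-sheet total
    non_eu_owned_or_controlled: bool

def qualifies_as_micro_or_small(p: Provider) -> bool:
    # Small-enterprise ceilings from the Annex to Recommendation
    # 2003/361/EC: fewer than 50 persons employed and a turnover or
    # balance-sheet total not exceeding EUR 10 million.
    return p.staff_headcount < 50 and p.annual_turnover_eur <= 10_000_000

def exempt(p: Provider) -> bool:
    # Amended Articles 13(2) and 16: the exemption applies only if the
    # enterprise is also not owned or controlled by entities established
    # outside the European Union.
    return qualifies_as_micro_or_small(p) and not p.non_eu_owned_or_controlled

# A small EU-owned platform keeps the exemption; the same platform
# under non-EU control loses it.
eu_owned = Provider(30, 5_000_000, non_eu_owned_or_controlled=False)
non_eu = Provider(30, 5_000_000, non_eu_owned_or_controlled=True)
assert exempt(eu_owned) and not exempt(non_eu)
```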
Amendment 693 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/19
Committee: JURI
Amendment 699 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
2021/07/19
Committee: JURI
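The amended Article 18(3) keeps the cost rule asymmetric: a winning recipient is reimbursed, while a winning platform recovers nothing from the recipient. A minimal sketch of that allocation, with hypothetical names:

```python
def amounts_owed(decided_for_recipient: bool,
                 recipient_costs_eur: float,
                 platform_costs_eur: float) -> tuple[float, float]:
    """Illustrative reading of the amended Article 18(3).

    Returns (owed_by_platform, owed_by_recipient): the platform
    reimburses the recipient's fees and reasonable expenses when the
    recipient wins; the recipient never reimburses the platform.
    """
    if decided_for_recipient:
        return recipient_costs_eur, 0.0
    return 0.0, 0.0

assert amounts_owed(True, 120.0, 300.0) == (120.0, 0.0)
assert amounts_owed(False, 120.0, 300.0) == (0.0, 0.0)
```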
Amendment 705 #
Proposal for a regulation
Article 19
[...]
deleted
2021/07/19
Committee: JURI
Amendment 706 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
deleted
2021/07/19
Committee: JURI
Amendment 713 #
Proposal for a regulation
Article 19 – paragraph 2
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions: (a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; (b) it represents collective interests and is independent from any online platform; (c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner.
deleted
2021/07/19
Committee: JURI
Amendment 714 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
deleted
2021/07/19
Committee: JURI
Amendment 718 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform;
deleted
2021/07/19
Committee: JURI
Amendment 723 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner.
deleted
2021/07/19
Committee: JURI
Amendment 729 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2.
deleted
2021/07/19
Committee: JURI
Amendment 731 #
Proposal for a regulation
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated.
deleted
2021/07/19
Committee: JURI
Amendment 735 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
deleted
2021/07/19
Committee: JURI
Amendment 739 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
deleted
2021/07/19
Committee: JURI
Amendment 741 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6.
deleted
2021/07/19
Committee: JURI
Amendment 746 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a specified period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content that has been duly declared illegal as defined in Article 2(g). The online platform may request support from the Digital Services Coordinator to establish the frequency at which account suspension is deemed necessary and to set the duration of the suspension.
2021/07/19
Committee: JURI
Amendment 753 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a specified period of time and after having issued at least three prior warnings, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/07/19
Committee: JURI
Amendment 758 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
2021/07/19
Committee: JURI
Amendment 763 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. The assessment must be carried out by qualified staff provided with dedicated training on the applicable legal framework.
2021/07/19
Committee: JURI
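Taken together, Amendments 746, 753, 758 and 763 reshape Article 20 suspensions: a specified (rather than merely reasonable) period, prior warnings (at least three for unfounded notices), counts of incidents over the past year, and assessment by trained staff. A minimal sketch of such a suspension test follows; the yearly threshold is a placeholder, since the text fixes no number and Amendment 746 lets the platform ask the Digital Services Coordinator for support in setting it:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Recipient:
    warnings_issued: int = 0
    # Dates on which items were duly declared illegal (Article 2(g))
    # or notices were found manifestly unfounded.
    incidents: list[date] = field(default_factory=list)

def should_suspend(r: Recipient, today: date,
                   min_warnings: int, yearly_threshold: int) -> bool:
    """Illustrative test combining the amended Article 20(1)-(3)."""
    window_start = today - timedelta(days=365)  # "in the past year"
    recent = sum(1 for d in r.incidents if d >= window_start)
    return r.warnings_issued >= min_warnings and recent >= yearly_threshold

# Article 20(1): one prior warning for providers of illegal content;
# Article 20(2) as amended: at least three for manifestly unfounded notices.
r = Recipient(warnings_issued=3,
              incidents=[date(2021, m, 1) for m in range(1, 7)])
print(should_suspend(r, date(2021, 7, 1), min_warnings=3, yearly_threshold=5))
```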
Amendment 815 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
2021/07/19
Committee: JURI
Amendment 817 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every twelve months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past twelve months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2021/07/19
Committee: JURI
Amendment 848 #
Proposal for a regulation
Article 25 – paragraph 4 – introductory part
4. The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active recipients of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned, the Commission and the national Digital Services Coordinators.
2021/07/19
Committee: JURI
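Amendments 817 and 848 leave the underlying arithmetic simple: a twelve-month rolling average of monthly active recipients (Article 23(2) as amended), checked at least every six months against the designation threshold of Article 25. A minimal sketch, assuming the 45 million threshold of Article 25(1) of the proposal; function names are hypothetical:

```python
from statistics import mean

VLOP_THRESHOLD = 45_000_000  # Article 25(1) of the proposal

def average_monthly_active_recipients(monthly_counts: list[int]) -> float:
    # Amended Article 23(2): the published figure is the average over
    # the past twelve months (the proposal used six).
    if len(monthly_counts) != 12:
        raise ValueError("expected twelve monthly counts")
    return mean(monthly_counts)

def is_designated(avg_union_recipients: float) -> bool:
    # Article 25(4): the check the Digital Services Coordinator of
    # establishment repeats at least every six months; True means the
    # platform is (or stays) designated as a very large online platform.
    return avg_union_recipients >= VLOP_THRESHOLD

print(is_designated(average_monthly_active_recipients([46_000_000] * 12)))  # True
```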
Amendment 861 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination and amplification of illegal content through their services;
2021/07/19
Committee: JURI
Amendment 872 #
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
(ca) copyright and intellectual property infringements and violations, pursuant to Article 17 of Directive (EU) 2019/790 on Copyright in the Digital Single Market.
2021/07/19
Committee: JURI
Amendment 875 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall also take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/07/19
Committee: JURI
Amendment 886 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) checking content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;
2021/07/19
Committee: JURI
Amendment 892 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;
deleted
2021/07/19
Committee: JURI
Amendment 904 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the systemic risks identified;
deleted
2021/07/19
Committee: JURI
Amendment 907 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
deleted
2021/07/19
Committee: JURI
Amendment 923 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
2021/07/19
Committee: JURI
Amendment 967 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the requesting Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
2021/07/19
Committee: JURI
Amendment 981 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraph 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following two reasons:
2021/07/19
Committee: JURI
Amendment 987 #
Proposal for a regulation
Article 31 – paragraph 7 – subparagraph 1
The interested Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
2021/07/19
Committee: JURI
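Amendments 981 and 987 keep the Article 31(6)-(7) timetable: the platform has 15 days from receipt of a data-access request to seek an amendment, and the authority then has 15 days to decide and, where relevant, set a new compliance period. A minimal sketch of those two clocks, with hypothetical names:

```python
from dataclasses import dataclass
from datetime import date, timedelta

AMENDMENT_WINDOW = timedelta(days=15)  # Article 31(6)
DECISION_WINDOW = timedelta(days=15)   # Article 31(7)

@dataclass
class DataAccessRequest:
    received_on: date  # day the platform received the request
    comply_by: date    # the "reasonable period, specified in the request"

def may_request_amendment(req: DataAccessRequest, today: date) -> bool:
    # The platform has 15 days from receipt to ask the requesting
    # Digital Services Coordinator or the Commission to amend the request.
    return today <= req.received_on + AMENDMENT_WINDOW

def amendment_decision_due(filed_on: date) -> date:
    # The authority decides on the amendment request within 15 days,
    # communicating any amended request and a new compliance period.
    return filed_on + DECISION_WINDOW

req = DataAccessRequest(received_on=date(2021, 9, 1), comply_by=date(2021, 10, 1))
assert may_request_amendment(req, date(2021, 9, 10))
assert amendment_decision_due(date(2021, 9, 10)) == date(2021, 9, 25)
```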
Amendment 992 #
Proposal for a regulation
Article 32 – paragraph 3 – point a
(a) cooperating with the Digital Services Coordinator of establishment(s) and the Commission for the purpose of this Regulation;
2021/07/19
Committee: JURI
Amendment 994 #
Proposal for a regulation
Article 32 – paragraph 5
5. Very large online platforms shall make public the name and contact details of the compliance officer.
2021/07/19
Committee: JURI
Amendment 1000 #
Proposal for a regulation
Article 33 – paragraph 2 – introductory part
2. In addition to the reports provided for in Article 13, very large online platforms shall make publicly available and transmit to the Digital Services Coordinator of establishment(s) and the Commission, at least once a year and within 30 days following the adoption of the audit implementation report provided for in Article 28(4):
2021/07/19
Committee: JURI
Amendment 1005 #
Proposal for a regulation
Article 33 – paragraph 3
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment(s) and the Commission, accompanied by a statement of the reasons for removing the information from the public reports.
2021/07/19
Committee: JURI
Amendment 1008 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
(b) electronic submission of notices by trusted flaggers under Article 19, including through application programming interfaces;
deleted
2021/07/19
Committee: JURI
Amendment 1017 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission, in agreement with the Board, may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/19
Committee: JURI
Amendment 1025 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
2021/07/19
Committee: JURI
Amendment 1049 #
Proposal for a regulation
Article 38 – paragraph 3 a (new)
3a. Member States shall ensure that their Digital Services Coordinators are informed by the relevant national, local and regional authorities on the diversity of platform sectors and issues covered by this Regulation.
2021/07/19
Committee: JURI
Amendment 1052 #
Proposal for a regulation
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation. With regard to very large online platforms that offer services in the Union, Member States where individuals or representative organisations received their services shall have jurisdiction.
2021/07/19
Committee: JURI
Amendment 1057 #
Proposal for a regulation
Article 40 – paragraph 4 a (new)
4a. The provisions of this article are without prejudice to the relevant consumer protection jurisdiction under the applicable Union and national law.
2021/07/19
Committee: JURI
Amendment 1068 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate and where providers are not qualified under Article 25, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
2021/07/19
Committee: JURI
Amendment 1081 #
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 1
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may recommend the interested Digital Services Coordinator of establishment(s) to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
2021/07/19
Committee: JURI
Amendment 1087 #
Proposal for a regulation
Article 45 – paragraph 3
3. The interested Digital Services Coordinator of establishment(s) shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
2021/07/19
Committee: JURI
Amendment 1090 #
Proposal for a regulation
Article 45 – paragraph 4
4. The interested Digital Services Coordinator of establishment(s) shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation.
2021/07/19
Committee: JURI
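Amendments 1087 and 1090 retain the two-month response deadline of Article 45(4) while letting a request for additional information suspend it under Article 45(3). A minimal sketch of that pausable deadline, approximating two months as 60 days; all names are hypothetical:

```python
from datetime import date, timedelta
from typing import Optional

def assessment_due(request_received: date,
                   info_requested_on: Optional[date] = None,
                   info_provided_on: Optional[date] = None) -> date:
    # Article 45(4): the requested Coordinator(s) must communicate their
    # assessment within two months (approximated here as 60 days).
    due = request_received + timedelta(days=60)
    # Article 45(3): asking the sender or the Board for additional
    # information suspends the period until it is provided.
    if info_requested_on is not None and info_provided_on is not None:
        due += info_provided_on - info_requested_on
    return due

base = assessment_due(date(2021, 9, 1))
paused = assessment_due(date(2021, 9, 1),
                        info_requested_on=date(2021, 9, 10),
                        info_provided_on=date(2021, 9, 20))
assert paused - base == timedelta(days=10)  # clock paused for 10 days
```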
Amendment 1091 #
Proposal for a regulation
Article 45 – paragraph 5
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the requested Digital Services Coordinator of establishment(s), it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
2021/07/19
Committee: JURI
Amendment 1094 #
Proposal for a regulation
Article 45 – paragraph 6
6. The Commission shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the interested Digital Services Coordinator of establishment(s) and, unless it referred the matter itself, the Board.
2021/07/19
Committee: JURI
Amendment 1097 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment(s) to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request.
2021/07/19
Committee: JURI
Amendment 1102 #
Proposal for a regulation
Article 46 – paragraph 2
2. Where a Digital Services Coordinator of establishment has reasons to suspect that a very large online platform infringed this Regulation, it may request the Commission to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting the Commission to intervene.
2021/07/19
Committee: JURI
Amendment 1111 #
Proposal for a regulation
Article 50 – paragraph 1 – introductory part
1. Where a Digital Services Coordinator of establishment adopts a decision finding that a very large online platform has infringed any of the provisions of Section 4 of Chapter III, it shall make use of the enhanced supervision system laid down in this Article. It shall take utmost account of any opinion and recommendation of the Commission and the Board pursuant to this Article.
2021/07/19
Committee: JURI
Amendment 1113 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to the interested Digital Services Coordinator(s) adopting such a decision within a reasonable time period.
2021/07/19
Committee: JURI
Amendment 1117 #
Proposal for a regulation
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, a Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
2021/07/19
Committee: JURI
Amendment 1118 #
Proposal for a regulation
Article 50 – paragraph 3 – introductory part
3. Within one month following receipt of the action plan, the Board shall communicate its opinion on the action plan to the relevant Digital Services Coordinator of establishment. Within one month following receipt of that opinion, that Digital Services Coordinator shall decide whether the action plan is appropriate to terminate or remedy the infringement.
2021/07/19
Committee: JURI
Amendment 1120 #
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 1
Where a Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within four months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2).
2021/07/19
Committee: JURI
Amendment 1121 #
Proposal for a regulation
Article 50 – paragraph 4 – introductory part
4. The interested Digital Services Coordinator of establishment shall communicate to the Commission, the Board and the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable:
2021/07/19
Committee: JURI
Amendment 1122 #
Proposal for a regulation
Article 50 – paragraph 4 – subparagraph 1
Pursuant to that communication, the Digital Services Coordinator of establishment shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Commission.
2021/07/19
Committee: JURI
Amendment 1125 #
Proposal for a regulation
Article 51 – paragraph 1 – point a
(a) is suspected of having infringed any of the provisions of this Regulation and any Digital Services Coordinator of establishment did take investigatory or enforcement measures, pursuant to the request of the Commission referred to in Article 45(7), upon the expiry of the time period set in that request;
2021/07/19
Committee: JURI
Amendment 1126 #
Proposal for a regulation
Article 51 – paragraph 1 – point b
(b) is suspected of having infringed any of the provisions of this Regulation and a Digital Services Coordinator of establishment requested the Commission to intervene in accordance with Article 46(2), upon the reception of that request;
2021/07/19
Committee: JURI
Amendment 1129 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
As regards points (a) and (b) of paragraph 1, pursuant to that notification, the Digital Services Coordinator of establishment concerned shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Commission.
2021/07/19
Committee: JURI
Amendment 1131 #
Proposal for a regulation
Article 54
Article 54
Power to conduct on-site inspections
1. In order to carry out the tasks assigned to it under this Section, the Commission may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1). 2. On-site inspections may also be carried out with the assistance of auditors or experts appointed by the Commission pursuant to Article 57(2). 3. During on-site inspections the Commission and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1). 4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union.
deleted
2021/07/19
Committee: JURI
Amendment 1136 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
1. The Commission shall adopt a non-compliance decision, after consulting the Board, where it finds that the very large online platform concerned does not comply with one or more of the following:
2021/07/19
Committee: JURI
Amendment 1137 #
Proposal for a regulation
Article 58 – paragraph 5
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision approved by the Board.
2021/07/19
Committee: JURI
Amendment 1139 #
Proposal for a regulation
Article 59 – paragraph 2 – point c
(c) refuse to submit to an on-site inspection pursuant to Article 54.
deleted
2021/07/19
Committee: JURI
Amendment 1140 #
Proposal for a regulation
Article 59 – paragraph 3
3. Before adopting the decision pursuant to paragraph 2, the Commission shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1) and to the Board.
2021/07/19
Committee: JURI
Amendment 1141 #
Proposal for a regulation
Article 60 – paragraph 1 – point b
(b) submit to an on-site inspection which it has ordered by decision pursuant to Article 54;
deleted
2021/07/19
Committee: JURI
Amendment 1142 #
Proposal for a regulation
Article 61 – paragraph 3 – point b
(b) on-site inspection;
deleted
2021/07/19
Committee: JURI
Amendment 1143 #
Proposal for a regulation
Article 65 – paragraph 1 – introductory part
1. Where all powers pursuant to this Article to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Commission may request the interested Digital Services Coordinator of establishment to act pursuant to Article 41(3) towards the very large online platform concerned.
2021/07/19
Committee: JURI