
79 Amendments of Nathalie COLIN-OESTERLÉ related to 2020/0361(COD)

Amendment 240 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as search engines, social networks or online marketplaces, and live streaming platforms or private messaging providers should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/08
Committee: IMCO
Amendment 265 #
Proposal for a regulation
Recital 18 a (new)
(18a) The exemptions from liability established in this Regulation should not be available to providers of intermediary services that do not comply with the due diligence obligations in this Regulation. The conditionality should further ensure that the standards to qualify for such exemptions contribute to a high level of safety and trust in the online environment.
2021/07/08
Committee: IMCO
Amendment 301 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability set out in this Regulation. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/08
Committee: IMCO
Amendment 336 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information should include the relevant e-mail addresses, telephone numbers, IP addresses and other contact details necessary to ensure such compliance. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
2021/07/08
Committee: IMCO
Amendment 347 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 352 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. The order may alternatively be drafted in the official language of the Member State whose authority issues the order against the specific item of illegal content; in such case, the point of contact is entitled upon request to a transcription, by said authority, into the language declared by the provider.
2021/06/10
Committee: LIBE
Amendment 359 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt, the effect given to the order and, where no effect has been given to the order, a statement of reasons explaining why information cannot be provided to the national judicial or administrative authority issuing the order.
2021/06/10
Committee: LIBE
Amendment 365 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/06/10
Committee: LIBE
Amendment 380 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10. The order may alternatively be drafted in the official language of the Member State whose authority issues the order against the specific item of illegal content; in such case, the point of contact is entitled upon request to a transcription, by said authority, into the language declared by the provider.
2021/06/10
Committee: LIBE
Amendment 381 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide, based on its own assessment, whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. It may also be possible for online platforms to prevent content that has already been identified as illegal and removed on the basis of a prior notice from reappearing. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/08
Committee: IMCO
Amendment 391 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent future uploads of already notified illegal content resulting from a valid notice and action procedure and should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/08
Committee: IMCO
Amendment 409 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent, effective and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they have a significant legitimate interest and a proven record in flagging illegal content with a high rate of accuracy, that they have demonstrated their competence in detecting, identifying and notifying illegal content or represent collective interests or a general interest to prevent infringements of Union law or provide redress, and that they work in a diligent and objective manner. Such entities can also be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, individual right-holders, their representatives, duly mandated third parties, organisations of industry and other independent entities that have a specific expertise and act in the best interests of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The same status should be granted to applicants within the meaning of Regulation (EU) No 608/2013 or in the case of complaints pursuant to Regulation (EU) 2019/1020, so as to ensure that existing rules regarding customs enforcement or consumer protection are effectively applied to online sales. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/08
Committee: IMCO
Amendment 424 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing or disseminating illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/08
Committee: IMCO
Amendment 464 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing, disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal, disabling of access, restriction of visibility or suspension or termination of monetisation, of the decision and provide a clear and specific statement of reasons for that decision.
2021/06/10
Committee: LIBE
Amendment 471 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of the information and, where relevant, the territorial scope of the disabling of access or the restriction;
2021/06/10
Committee: LIBE
Amendment 477 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 (new)
2. When the removal of, or disabling of access to, specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the provision of information to the recipient referred to in paragraph 1 shall be postponed for a period of six weeks in order not to interfere with potential ongoing criminal investigations. That period of six weeks may be renewed only after a motivated decision of the competent authority to which the specific items of information were transmitted.
2021/06/10
Committee: LIBE
Amendment 481 #
Proposal for a regulation
Article 15 a (new)
Article 15 a Preservation of content and related data, and mandatory transmission of specific items of information 1. Providers of hosting services shall store the illegal content which has been removed or access to which has been disabled as a result of content moderation, or of an order to act against a specific item of illegal content as referred to in Article 8, as well as any related data removed as a consequence of the removal of such illegal content, which are necessary for: (a) administrative or judicial review proceedings or out-of-court dispute settlement against a decision to remove or disable access to illegal content and related data; or (b) the prevention, detection, investigation and prosecution of criminal offences. 2. The illegal content and related data, as referred to in paragraph 1, shall be stored for six months from the date of removal or disabling. The illegal content shall, upon request from the competent authority or court, be preserved for a further specified period only if and for as long as necessary for ongoing administrative or judicial review proceedings, as referred to in point (a) of paragraph 1. 3. Providers of hosting services shall ensure that the illegal content and related data stored pursuant to paragraph 1 are subject to appropriate technical and organisational safeguards. Those technical and organisational safeguards shall ensure that the illegal content and related data stored are accessed and processed only for the purposes referred to in paragraph 1, and ensure a high level of security of the personal data concerned. Providers of hosting services shall review and update those safeguards where necessary. 4. Providers of hosting services shall transmit to the competent authorities of the Member States the illegal content which has been removed or access to which has been disabled, whether such removal or disabling of access is the result of voluntary content moderation or of the use of the notice and action mechanism referred to in Article 14. This obligation of transmission applies under the following conditions: (a) illegal content referred to in this paragraph means content which is manifestly illegal and is an offence according to [Framework Decision 2008/913/JHA and Directive 2011/36/EU]; and (b) the competent law enforcement authority to which such illegal content is to be transmitted is that of the Member State of the residence or establishment of the person who made the illegal content available, or, failing that, the law enforcement authority of the Member State in which the provider of hosting services is established or has its legal representative; or, failing that, the provider of hosting services shall inform Europol; (c) when the provider of hosting services is a very large online platform in accordance with Section 4 of Chapter III, it shall also, when transmitting the illegal content, add a flag indicating that the illegal content involves a threat to the life or safety of persons. 5. Each Member State shall notify to the European Commission and to the Council the list of its competent law enforcement authorities as referred to in paragraph 4.
2021/06/10
Committee: LIBE
Amendment 488 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/06/10
Committee: LIBE
Amendment 493 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
2021/06/10
Committee: LIBE
Amendment 496 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1 a. decisions to restrict the ability to monetise content provided by the recipients.
2021/06/10
Committee: LIBE
Amendment 497 #
Proposal for a regulation
Article 17 – paragraph 1 b (new)
1 b. When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months referred to in paragraph 1 shall begin on the day on which the information was given to the recipient in accordance with Article 15(2).
2021/06/10
Committee: LIBE
Amendment 509 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
2021/06/10
Committee: LIBE
Amendment 546 #
Proposal for a regulation
Recital 81
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations as well as parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross- border cooperation.
2021/07/08
Committee: IMCO
Amendment 576 #
Proposal for a regulation
Article 21 – title
15c. Notification of suspicions of criminal offences
2021/06/10
Committee: LIBE
Amendment 577 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2021/06/10
Committee: LIBE
Amendment 582 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol.
2021/06/10
Committee: LIBE
Amendment 584 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. For the purpose of this Article, each Member State shall notify to the European Commission and to the Council the list of its competent law enforcement or judicial authorities.
2021/06/10
Committee: LIBE
Amendment 652 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;
2021/07/08
Committee: IMCO
Amendment 660 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States compared to their total population; or
2021/07/08
Committee: IMCO
Amendment 675 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service and which does not have any active role in data processing;
2021/07/08
Committee: IMCO
Amendment 678 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
— an online platform as defined in point (h) of this Regulation;
2021/07/08
Committee: IMCO
Amendment 688 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to illegal content, products, services or activity, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/08
Committee: IMCO
Amendment 699 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information and optimises its content, unless that activity is a minor and purely ancillary feature of the main service and, for objective and technical reasons, cannot be used without the main service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation;
2021/07/08
Committee: IMCO
Amendment 707 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘online marketplace’ means an online platform allowing consumers to conclude distance contracts with traders;
2021/07/08
Committee: IMCO
Amendment 708 #
Proposal for a regulation
Article 2 – paragraph 1 – point h b (new)
(hb) ‘live streaming platform service’ means an information society service the main or one of the main purposes of which is to give the public access to audio or video material that is broadcast live by its users, which it organises and promotes for profit-making purposes;
2021/07/08
Committee: IMCO
Amendment 709 #
Proposal for a regulation
Article 2 – paragraph 1 – point h c (new)
(hc) ‘private messaging service’ means a number-independent interpersonal communications service as defined in Article 2(7) of Directive (EU) 2018/1972, excluding transmission of electronic mail as defined in Article 2(h) of Directive 2002/58/EC.
2021/07/08
Committee: IMCO
Amendment 711 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a significant and potentially unlimited number of third parties;
2021/07/08
Committee: IMCO
Amendment 723 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services, regardless of whether they are automated or processed by a person, which are aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
2021/07/08
Committee: IMCO
Amendment 758 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, expeditiously and permanently removes or disables access to the illegal content; expeditiously means immediately or as fast as possible and in any event no later than 30 minutes where the illegal content pertains to the broadcast of a live sports or entertainment event.
2021/07/08
Committee: IMCO
Amendment 775 #
Proposal for a regulation
Article 5 – paragraph 3 a (new)
3a. Paragraph 1 shall not apply when the provider of intermediary services engages in illegal activities.
2021/07/08
Committee: IMCO
Amendment 784 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, except when they engage in or facilitate illegal activities or when they do not comply with the due diligence obligations laid down in this Regulation.
2021/07/08
Committee: IMCO
Amendment 791 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Paragraph 1 shall apply only when intermediary services are compliant with the due diligence obligations laid down in this Regulation.
2021/07/08
Committee: IMCO
Amendment 834 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10; upon a decision by a Member State an order may be drafted in the official language of the Member State whose authority issued the order against the specific item of illegal content; in such case, the point of contact shall be entitled, upon request, to a transcription by that authority into the language declared by the provider.
2021/07/08
Committee: IMCO
Amendment 859 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. Where no effect has been given to the order, a statement shall explain the reasons why the information cannot be provided to the national judicial or administrative authority that issued the order.
2021/07/08
Committee: IMCO
Amendment 873 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for official reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/07/08
Committee: IMCO
Amendment 877 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10. Upon a decision by a Member State, the order may be drafted in the official language of the Member State whose authority issued the order against the specific item of illegal content. In such case, the point of contact shall be entitled, upon request, to a transcription by that authority into the language declared by the provider.
2021/07/08
Committee: IMCO
Amendment 898 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, for those already existing as soon as possible, and for those to be established prior to the establishment, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services.
2021/07/08
Committee: IMCO
Amendment 915 #
Proposal for a regulation
Article 11 – paragraph 2
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources in order to guarantee their proper and timely cooperation with the Member States’ authorities, the Commission and the Board and compliance with those decisions.
2021/07/08
Committee: IMCO
Amendment 919 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a. Providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to obtain a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.
2021/07/08
Committee: IMCO
Amendment 1012 #
Proposal for a regulation
Article 13 a (new)
Article 13a Trusted flaggers 1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are processed and decided upon immediately, without prejudice to the implementation of a complaint and redress mechanism. 2. The status of trusted flagger under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions, without prejudice to the implementation of a complaint and redress mechanism: (a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; (b) it represents collective interests, including the general interest to prevent or provide redress for infringements of Union law, and is independent from any online platform; (c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner, and it is independent. 3. The conditions set out in paragraph 2 shall allow trusted flaggers’ notifications to be sufficient for immediate removal or disabling of the content notified by them. 4. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2. 5. The Commission shall publish the information referred to in paragraph 4 in a publicly available database and keep the database updated. 6. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices, or notices aimed at distorting competition, through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. 7. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. The Digital Services Coordinator may take into account any evidence according to which the entity would have used its status to distort competition. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. 8. The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 6 and 7.
2021/07/08
Committee: IMCO
Amendment 1021 #
Proposal for a regulation
Chapter III – Section 2 – title
Additional provisions applicable to providers of hosting services, including online platforms, and to providers of live streaming platform services and of private messaging services
2021/07/08
Committee: IMCO
Amendment 1025 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of private messaging services and providers of hosting services, including online platforms, shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, clearly visible, low-threshold, user-friendly, and located close to the content in question, allowing for the submission of notices exclusively by electronic means.
2021/07/08
Committee: IMCO
Amendment 1050 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content, if the application of the service that is used by the recipient allows it;
2021/07/08
Committee: IMCO
Amendment 1073 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services, including online platforms, and of private messaging services shall, without prejudice to Article 5(1), point (b), process any notices that they receive under the mechanisms referred to in paragraph 1 of this Article, and remove or disable access to the illegal content without undue delay and within seven days of the receipt of the notification at the latest. Resulting from a valid notice and action procedure, providers of hosting services shall prevent future uploads of already notified illegal content by putting in place effective, reasonable and proportionate measures.
2021/07/08
Committee: IMCO
Amendment 1083 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services shall, without undue delay and within seven days of the receipt of the notification at the latest, inform consumers who have purchased illegal products between the moment they were uploaded on the provider’s website and the moment the listing was taken down by the platform following a valid notice.
2021/07/08
Committee: IMCO
Amendment 1100 #
Proposal for a regulation
Article 15 – paragraph 2 – introductory part
2. When the removal of, or disabling of access to, specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the provision of information to the recipient in accordance with paragraph 1 shall be postponed for a period of six weeks in order not to interfere with potential ongoing criminal investigations. That period of six weeks may be renewed only after a motivated decision of the competent authority to which the specific items of information were transmitted. The statement of reasons referred to in paragraph 1 shall at least contain the following information:
2021/07/08
Committee: IMCO
Amendment 1103 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of the information and, where relevant, the territorial scope of the disabling of access or of the restriction of visibility;
2021/07/08
Committee: IMCO
Amendment 1126 #
Proposal for a regulation
Article 15 a (new)
Article 15a Preservation of content and related data, and mandatory transmission of specific items of information 1. Providers of hosting services shall store the illegal content which has been removed or access to which has been disabled as a result of content moderation, or of an order to act against a specific item of illegal content as referred to in Article 8, as well as any related data removed as a consequence of the removal of such illegal content, which are necessary for: (a) administrative or judicial review or out-of-court dispute settlement against a decision to remove or disable access to illegal content and related data; or (b) the prevention, detection, investigation and prosecution of criminal offences. 2. Providers of hosting services shall store the illegal content and related data pursuant to paragraph 1 for six months from the date of removal or disabling access to it. The illegal content shall, upon request from the competent authority or court, be stored for a further specified period only if and for as long as necessary for ongoing administrative or judicial review as referred to in paragraph 1, point (a). 3. Providers of hosting services shall ensure that the illegal content and related data stored pursuant to paragraph 1 are subject to appropriate technical and organisational safeguards. Those technical and organisational safeguards shall ensure that the illegal content and related data stored are accessed and processed only for the purposes referred to in paragraph 1 and shall ensure a high level of security of the personal data concerned. Providers of hosting services shall review and update those safeguards where necessary. 4. Providers of hosting services shall transmit to the competent authorities of the Member States the illegal content which has been removed or access to which has been disabled, whether such removal or disabling of access is the result of voluntary content moderation or of the use of the notice and action mechanism referred to in Article 14. They shall transmit that illegal content under the following conditions: (a) illegal content referred to in this paragraph means content which is manifestly illegal and is an offence in accordance with Council Framework Decision 2008/913/JHA1a and Directive 2011/36/EU of the European Parliament and of the Council1b; and (b) the competent law enforcement authority to receive such illegal content is that of the Member State of the residence or establishment of the person who made the illegal content available, or, failing that, the law enforcement authority is that of the Member State in which the provider of hosting services is established or has its legal representative, or, failing that, the provider of hosting services shall inform Europol; (c) when the provider of hosting services is a very large online platform in accordance with Section 4 of Chapter III, it shall, when transmitting the illegal content, add a flag indicating that the illegal content involves a threat to the life or safety of persons. 5. Each Member State shall notify to the Commission the list of its competent law enforcement authorities for the purposes of paragraph 4.
2021/07/08
Committee: IMCO
Amendment 1133 #
Proposal for a regulation
Article 15 b (new)
Article 15b Notification of suspicions of serious criminal offences 1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. 2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative, or shall inform Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the serious criminal offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected serious criminal offence resides or is located. For the purpose of this Article, each Member State shall notify to the Commission the list of its competent law enforcement or judicial authorities.
2021/07/08
Committee: IMCO
Amendment 1146 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/07/08
Committee: IMCO
Amendment 1156 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
2021/07/08
Committee: IMCO
Amendment 1168 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetise content provided by the recipients.
2021/07/08
Committee: IMCO
Amendment 1179 #
Proposal for a regulation
Article 17 – paragraph 2 – subparagraph 1 a (new)
When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months referred to in paragraph 1 of this Article begins on the day on which the information was given to the recipient in accordance with Article 15.
2021/07/08
Committee: IMCO
Amendment 1181 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner and without undue delay and at the latest within seven days of the notification. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/07/08
Committee: IMCO
Amendment 1259 #
Proposal for a regulation
Article 19
[...] deleted
2021/07/08
Committee: IMCO
Amendment 1323 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall, after having issued a prior warning, suspend, for a reasonable period of time, or terminate the provision of their services to recipients of the service that repeatedly provide manifestly illegal content.
2021/07/08
Committee: IMCO
Amendment 1332 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall, after having issued a prior warning, suspend, for a reasonable period of time, or terminate the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/07/08
Committee: IMCO
Amendment 1527 #
Proposal for a regulation
Article 25 – title
Very large online platforms, live streaming platforms, private messaging providers and search engines
2021/07/08
Committee: IMCO
Amendment 1532 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platform services, live streaming platform services, private messaging services and search engine services which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/07/08
Committee: IMCO
Amendment 1548 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platform services, live streaming platform services, private messaging services and search engine services shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1599 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platform services, live streaming platform services, private messaging services and search engine services shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1904 #
Proposal for a regulation
Article 37 – paragraph 5
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it shall request the participants to remove and, where necessary, revise the crisis protocol, including by taking additional measures.
2021/07/08
Committee: IMCO
Amendment 1907 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. Those competent authorities shall have the same powers to carry out the tasks or supervise the sectors assigned to them as those attributed to the Digital Services Coordinator for the application and enforcement of this Regulation. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union.
2021/07/08
Committee: IMCO
Amendment 1927 #
Proposal for a regulation
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapter III, Sections 1 to 4, as well as Chapter IV of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1930 #
Proposal for a regulation
Article 40 – paragraph 1 a (new)
1a. The Member State where the consumers have their habitual residence shall have jurisdiction for the purposes of Chapter III, Section 3.
2021/07/08
Committee: IMCO
Amendment 1931 #
Proposal for a regulation
Article 40 – paragraph 1 b (new)
1b. The Member State where the authority issuing the order is situated shall have jurisdiction for the purposes of Articles 8 and 9.
2021/07/08
Committee: IMCO
Amendment 1964 #
Proposal for a regulation
Article 42 a (new)
Article 42a In accordance with the conditional exemption from liability laid down in Article 1(1)(a), Member States shall ensure that the penalty for repeatedly failing to comply with the obligations under this Regulation includes the horizontal loss of the liability exemption for the intermediary service provider.
2021/07/08
Committee: IMCO
Amendment 1968 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service, as well as other parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform, shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
2021/07/08
Committee: IMCO
Amendment 2066 #
Proposal for a regulation
Article 48 – paragraph 5
5. The Board may invite experts and observers to attend its meetings, and shall cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.
2021/07/08
Committee: IMCO