Activities of Geoffroy DIDIER related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (debate)
Opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (121)
Amendment 97 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. Where a provider of intermediary services promotes or references the content, the exemption from liability established in this Regulation should not apply to it.
Amendment 101 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that engages in illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 108 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts. This Regulation should not affect Member States’ freedom to regulate issues on which those other acts leave Member States the possibility of adopting certain measures at national level. In the event of a conflict between Directive 2010/13/EU as amended and this Regulation, Directive 2010/13/EU as well as the national measures taken in accordance with that Directive should prevail. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 125 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 136 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider shall prevent future uploads of already notified illegal content resulting from a valid notice and action procedure and should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 147 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent, effective and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they have significant legitimate interest and a proven record in flagging illegal content with a high rate of accuracy and that they have demonstrated their competence in detecting, identifying and notifying illegal content or represent collective interests and that they work in a diligent and objective manner. Such entities can also be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, individual right-holders, their representatives, duly mandated third parties, organisations of industry and other independent entities that have a specific expertise and act in the best interests of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 155 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing or disseminating illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 156 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers and other users, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter the selling and dissemination of products and services in violation of the applicable rules, all providers of intermediary services, including hosting providers, domain name registrars, providers of content delivery networks, proxy and reverse proxy providers, online marketplaces, online payment service providers and online advertising service providers should ensure that their business customers are traceable. The business customer should therefore be required to provide certain essential information to the online platform or provider of intermediary services, including for purposes of promoting messages on or offering products. That requirement should also be applicable to business customers that promote messages on products or services on behalf of brands, based on underlying agreements. Providers of intermediary services should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed and verified, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 159 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of intermediary services should make reasonable efforts to verify the reliability of the information provided by their business customers, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting their business customers to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of intermediary services should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers of intermediary services, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability and accuracy of the information towards consumers or other interested parties. Such providers of intermediary services should update the information they hold on a risk-sensitive basis, and at least once a year, and also design and organise their online interface in a way that enables their business customers to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 197 #
Proposal for a regulation
Recital 81
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations as well as parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross- border cooperation.
Amendment 224 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(e a) ‘business customer’ means:
- legal entities, except any entity which qualifies as a large undertaking as defined in Article 3(4) of Directive 2013/34/EU of the European Parliament and of the Council;
- any natural person that purchases a type or amount of service indicative of, or otherwise indicates, the intent to operate a business online or contracts for the purchase of more than €10.000 of services provided by the intermediary service provider in a one-year period;
Amendment 240 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as search engines, social networks or online marketplaces, and live streaming platforms or private messaging providers should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 258 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. The provider of intermediary services is considered to play an active role when it optimises, promotes, classifies, organises and references the content, regardless of whether this is automated or not.
Amendment 260 #
Proposal for a regulation
Article 5 – paragraph 3 a (new)
3 a. Paragraph 1 shall not apply when the provider of intermediary services engages in illegal activities.
Amendment 265 #
Proposal for a regulation
Recital 18 a (new)
(18a) The exemptions from liability established in this Regulation should not be available to providers of intermediary services that do not comply with the due diligence obligations in this Regulation. The conditionality should further ensure that the standards to qualify for such exemptions contribute to a high level of safety and trust in the online environment.
Amendment 268 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services the main purpose of which is to engage in or facilitate illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 288 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, providers of hosting services, such as online platforms that allow consumers to conclude distance contracts with traders, and other service providers should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as they present the relevant information relating to the transactions or exchanges at issue in such a way that it leads consumers to believe that the information was provided by those hosting service providers themselves or by recipients of the service acting under their authority or control, and that those hosting service providers thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 299 #
Proposal for a regulation
Article 12 a (new)
Amendment 301 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability set out in this Regulation. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 316 #
Proposal for a regulation
Recital 28
(28) Member States are prevented from imposing a monitoring obligation on service providers only with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as impeding providers from taking proactive measures to identify and remove illegal content and to prevent that it reappears.
Amendment 336 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information should include the relevant e-mail addresses, telephone numbers, IP addresses and other contact details necessary to ensure such compliance. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
Amendment 338 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Resulting from a valid notice and action procedure, providers of hosting services shall prevent future uploads of already notified illegal content by putting in place effective, reasonable and proportionate measures. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 346 #
Proposal for a regulation
Recital 106 a (new)
(106 a) In order to promote the freedom of expression and media pluralism online, the importance of editorial content and services must be recognised, requiring intermediary service providers to refrain from removing, suspending or disabling access to it. It follows that intermediary service providers should be exempt from liability for editorial content and services. Intermediary service providers should put mechanisms in place to facilitate the practical application, for example the flagging of lawful editorial content and services by content providers. Providers of editorial content and services should be identified by the Member State in which the provider is established. These providers should be understood as performing an economic activity within the meaning of Articles 56 and 57 TFEU.
Amendment 347 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear, effective and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 352 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. The order may alternatively be drafted in the official language of the Member State whose authority issues the order against the specific item of illegal content; in such case, the point of contact is entitled upon request to a transcription, by said authority, into the language declared by the provider.
Amendment 359 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt, the effect given to the order and, where no effect has been given to the order, a statement of reasons explaining why information cannot be provided to the national judicial or administrative authority issuing the order.
Amendment 361 #
Proposal for a regulation
Article 15 a (new)
Amendment 365 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
Amendment 368 #
Proposal for a regulation
Article 1 a (new)
Article 1 a
Contractual provisions
1. Any contractual provisions between an intermediary service provider and a trader, business user, or a recipient of its service which are contrary to this Regulation shall be unenforceable.
2. This Regulation shall apply irrespective of the law applicable to contracts concluded between providers of intermediary services and a recipient of the service, a consumer, a trader or business user.
Amendment 380 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10. The order may alternatively be drafted in the official language of the Member State whose authority issues the order against the specific item of illegal content; in such case, the point of contact is entitled upon request to a transcription, by said authority, into the language declared by the provider.
Amendment 381 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide, based on its own assessment, whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. It may also be possible for online platforms to prevent content that has already been identified as illegal and that has been removed on the basis of a prior notice from reappearing. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 391 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent future uploads of already notified illegal content resulting from a valid notice and action procedure and should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 399 #
Proposal for a regulation
Article 19
Amendment 409 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent, effective and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they have a significant legitimate interest and a proven record in flagging illegal content with a high rate of accuracy and that they have demonstrated their competence in detecting, identifying and notifying illegal content, or that they represent collective interests or a general interest to prevent infringements of Union law or provide redress, and that they work in a diligent and objective manner. Such entities can also be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, individual right-holders, their representatives, duly mandated third parties, organisations of industry and other independent entities that have a specific expertise and act in the best interests of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The same status should be granted to applicants within the meaning of Regulation (EU) No 608/2013 or in the case of complaints pursuant to Regulation (EU) 2019/1020, so as to ensure that existing rules regarding customs enforcement or consumer protection are effectively applied to online sales. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 424 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing or disseminating illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content, and notices or complaints should be considered manifestly unfounded, where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 434 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers and other users, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter the selling and dissemination of products and services in violation of the applicable rules, all providers of intermediary services, including hosting providers, domain name registrars, providers of content delivery networks, proxy and reverse proxy providers, online marketplaces, online payment service providers and online advertising service providers, should ensure that their business customers are traceable. The business customer should therefore be required to provide certain essential information to the provider of intermediary services, including for purposes of promoting messages on or offering products. That requirement should also be applicable to business customers that promote messages on products or services on behalf of brands, based on underlying agreements. Providers of intermediary services should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed and verified, in accordance with the applicable law, including on the protection of personal data, by the providers of intermediary services, public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 438 #
Proposal for a regulation
Article 22
Amendment 444 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, providers of intermediary services should make reasonable efforts to verify the reliability of the information provided by their business customers, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting their business customers to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, providers of intermediary services should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers of intermediary services, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability and accuracy of the information towards consumers or other interested parties. Such providers of intermediary services should update the information they hold on a risk-sensitive basis, and at least once a year, and also design and organise their online interface in a way that enables their business customers to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
__________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 464 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing, disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access or the restriction of visibility or the suspension or termination of monetization, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 471 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, the disabling of access to, the restriction of the visibility of, or the demonetization of the information and, where relevant, the territorial scope of the disabling of access or the restriction;
Amendment 477 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 (new)
2. When the removal of or disabling of access to specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the information of the recipient referred to in paragraph 1 shall be postponed by a period of six weeks in order not to interfere with potential ongoing criminal investigations. That period of six weeks may be renewed only following a reasoned decision of the competent authority to which the specific items of information were transmitted.
Amendment 481 #
Proposal for a regulation
Article 15 a (new)
Amendment 488 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 493 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
Amendment 497 #
Proposal for a regulation
Article 17 – paragraph 1 b (new)
1b. When the decision to remove or disable access to the information is followed by the transmission of that information in accordance with Article 15a, the period of at least six months mentioned in paragraph 1 begins to run from the day on which the information was given to the recipient in accordance with Article 15(2).
Amendment 509 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
Amendment 544 #
Proposal for a regulation
Recital 80
(80) Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, whether the provider is active in several Member States. The Digital Services Coordinator should have the power to request the relevant judicial authority to take meaningful action when the provider of intermediary services has repeatedly infringed the obligations laid down in this Regulation.
Amendment 546 #
Proposal for a regulation
Recital 81
(81) In order to ensure effective enforcement of this Regulation, individuals or representative organisations as well as parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross- border cooperation.
Amendment 576 #
Proposal for a regulation
Article 21 – title
15c. Notification of suspicions of criminal offences
Amendment 577 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
Amendment 582 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative, or inform Europol.
Amendment 583 #
Proposal for a regulation
Article 13 a (new)
Article 13a Providers of intermediary services shall ensure that the identity, such as the trademark/logo or other characteristic traits, of the business user providing the goods or services on the intermediary services is clearly visible alongside the goods or services offered.
Amendment 584 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. For the purpose of this Article, each Member State shall notify to the European Commission and to the Council the list of its competent law enforcement or judicial authorities.
Amendment 607 #
Proposal for a regulation
Article 14 – paragraph 3 a (new)
3a. Notices that concern content and services of media service providers identified pursuant to Article 11a paragraph 2a shall be processed and resolved within the existing internal complaints mechanisms of media service providers and shall include the possibility to seise the competent national judicial or regulatory authority or supervisory body. Content and services of media service providers shall remain available on hosting services until notices are resolved.
Amendment 652 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;
Amendment 660 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States compared to their total population; or
Amendment 670 #
Proposal for a regulation
Article 2 – paragraph 1 – point e a (new)
(ea) ‘business customer’ means:
- legal entities, except any entity which qualifies as a large undertaking as defined in Article 3(4) of Directive 2013/34/EU of the European Parliament and of the Council;
- any natural person that purchases a type or amount of service indicative of, or otherwise indicates, the intent to operate a business online, or contracts for the purchase of more than €10,000 of services provided by the intermediary service provider in a one-year period;
Amendment 672 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – introductory part
(f) ‘intermediary service’ means one of the following information society services:
Amendment 675 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service and which does not have any active role in data processing;
Amendment 678 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- an online platform as defined in point (h) of this Regulation;
Amendment 687 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service, as well as other parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform, shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
Amendment 688 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to illegal content, products, services or activity, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 699 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information and optimises its content, unless that activity is a minor and purely ancillary feature of the main service and, for objective and technical reasons, cannot be used without the main service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation;
Amendment 707 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘online marketplace’ means an online platform allowing consumers to conclude distance contracts with traders;
Amendment 708 #
Proposal for a regulation
Article 2 – paragraph 1 – point h b (new)
(hb) ‘live streaming platform service’ means an information society service the main or one of the main purposes of which is to give the public access to audio or video material that is broadcast live by its users, which it organises and promotes for profit-making purposes;
Amendment 709 #
Proposal for a regulation
Article 2 – paragraph 1 – point h c (new)
(hc) ‘private messaging service’ means a number-independent interpersonal communications service as defined in Article 2(7) of Directive (EU) 2018/1972, excluding transmission of electronic mail as defined in Article 2(h) of Directive 2002/58/EC.
Amendment 711 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a significant and potentially unlimited number of third parties;
Amendment 723 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services, regardless of whether they are automated or processed by a person, which are aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 758 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously and permanently removes or disables access to the illegal content; expeditiously means immediately or as fast as possible and in any event no later than within 30 minutes where the illegal content pertains to the broadcast of a live sports or entertainment event.
Amendment 775 #
Proposal for a regulation
Article 5 – paragraph 3 a (new)
3a. Paragraph 1 shall not apply when the provider of intermediary services engages in illegal activities.
Amendment 784 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, except when they engage in or facilitate illegal activities or when they do not comply with the due diligence obligations laid down in this Regulation.
Amendment 791 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Paragraph 1 shall apply only where providers of intermediary services comply with the due diligence obligations laid down in this Regulation.
Amendment 834 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by the provider in accordance with Article 10; upon a decision by a Member State, an order may be drafted in the official language of the Member State whose authority issued the order against the specific item of illegal content; in such case, the point of contact shall be entitled, upon request, to a translation by that authority into the language declared by the provider.
Amendment 841 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Recommender systems
1. Online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679.
2. Where several options are available pursuant to paragraph 1, online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
Amendment 859 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and of the effect given to the order. Where no effect has been given to the order, a statement shall explain the reasons why the information cannot be provided to the national judicial or administrative authority that issued the order.
Amendment 873 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for official reasons related to the prevention, investigation, detection and prosecution of criminal offences;
Amendment 877 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10. Upon a decision by a Member State, the order may be drafted in the official language of the Member State whose authority issued the order against the specific item of illegal content. In such case, the point of contact shall be entitled, upon request, to a translation by that authority into the language declared by the provider.
Amendment 898 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, for those already existing as soon as possible, for those to be established prior to the establishment, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services.
Amendment 915 #
Proposal for a regulation
Article 11 – paragraph 2
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources in order to guarantee their proper and timely cooperation with the Member States’ authorities, the Commission and the Board and compliance with those decisions.
Amendment 919 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a. Providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC, and that have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to obtain a legal representative facilitate further cooperation and recommend possible solutions, including possibilities for collective representation.
Amendment 926 #
Proposal for a regulation
Article 29
Amendment 927 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall ensure that their terms and conditions prohibit the recipients of their services from providing information that is not in compliance with Union law or the law of the Member State where such information is made available.
Amendment 945 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall ensure that any additional restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service are designed with due regard to the fundamental rights as enshrined in the Charter.
Amendment 946 #
Proposal for a regulation
Article 12 – paragraph 2 – subparagraph 1 a (new)
Providers of intermediary services shall enforce the additional restrictions referred to in the first subparagraph in a diligent, objective and proportionate manner, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 965 #
Proposal for a regulation
Article 12 a (new)
Amendment 1012 #
Proposal for a regulation
Article 13 a (new)
Amendment 1021 #
Proposal for a regulation
Chapter III – Section 2 – title
Additional provisions applicable to providers of hosting services, including online platforms, and to providers of livestreaming platform services and of private messaging services
Amendment 1025 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of private messaging services and providers of hosting services, including online platforms, shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, clearly visible, low-threshold, user-friendly and located close to the content in question, allowing for the submission of notices exclusively by electronic means.
Amendment 1050 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information enabling the identification of the illegal content, if the application of the service that is used by the recipient allows it;
Amendment 1073 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services, including online platforms, and of private messaging services shall, without prejudice to Article 5(1), point (b), process any notices that they receive under the mechanisms referred to in paragraph 1 of this Article and remove or disable access to the illegal content without undue delay and at the latest within seven days of receipt of the notification. Following a valid notice and action procedure, providers of hosting services shall prevent future uploads of already notified illegal content by putting in place effective, reasonable and proportionate measures.
Amendment 1083 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services shall, without undue delay and at the latest within seven days of receipt of the notification, inform consumers who have purchased illegal products between the moment those products were uploaded on the provider’s website and the moment the listing was taken down by the platform following a valid notice.
Amendment 1095 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing, disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal, disabling of access, restriction of visibility or suspension or termination of monetisation, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 1100 #
Proposal for a regulation
Article 15 – paragraph 2 – introductory part
2. Where the removal of or disabling of access to specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the provision of information to the recipient in accordance with paragraph 1 shall be postponed for a period of six weeks in order not to interfere with potential ongoing criminal investigations. That period of six weeks may be renewed only following a reasoned decision of the competent authority to which the specific items of information were transmitted. The statement of reasons referred to in paragraph 1 shall at least contain the following information:
Amendment 1103 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of the information and, where relevant, the territorial scope of the disabling of access or of the restriction of visibility;
Amendment 1126 #
Proposal for a regulation
Article 15 a (new)
Amendment 1133 #
Proposal for a regulation
Article 15 b (new)
Article 15b
Notification of suspicions of serious criminal offences
1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2. Where a provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative, or shall inform Europol.
For the purpose of this Article, the Member State concerned shall be the Member State where the serious criminal offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected serious criminal offence resides or is located.
For the purpose of this Article, each Member State shall notify to the Commission the list of its competent law enforcement or judicial authorities.
Amendment 1146 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1156 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
Amendment 1168 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetise content provided by the recipients.
Amendment 1179 #
Proposal for a regulation
Article 17 – paragraph 2 – subparagraph 1 a (new)
When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months referred to in paragraph 1 of this Article begins on the day on which the information was given to the recipient in accordance with Article 15.
Amendment 1181 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner, without undue delay and at the latest within seven days of the notification. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1259 #
Proposal for a regulation
Article 19
Amendment 1323 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall, after having issued a prior warning, suspend for a reasonable period of time, or terminate, the provision of their services to recipients of the service that repeatedly provide manifestly illegal content.
Amendment 1332 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall, after having issued a prior warning, suspend for a reasonable period of time, or terminate, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 1367 #
Proposal for a regulation
Article 22
Amendment 1527 #
Proposal for a regulation
Article 25 – title
Very large online platforms, live streaming platforms, private messaging providers and search engines
Amendment 1532 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platform services, live streaming platform services, private messaging services and search engine services which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
Amendment 1548 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platform services, live streaming platform services, private messaging services and search engine services shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4) and at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1599 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platform services, live streaming platform services, private messaging services and search engine services shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 1638 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 1735 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, upon the request of advertisers, independent audits performed by organisations complying with the criteria set out in Article 28(2), at a reasonable frequency and under fair and proportionate conditions agreed between platforms and advertisers, in order to:
(a) conduct a quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with their terms and conditions;
(b) detect fraudulent use of their services to fund illegal activities;
(c) assess the performance of their tools in terms of brand safety.
The report shall include an audit opinion on the performance of the tools of a very large online platform in terms of brand safety, either positive, positive with comments or negative. Where the audit opinion is not positive, operational recommendations for specific measures to achieve compliance shall be provided. Very large online platforms shall make the result of that audit available to advertisers upon their request.
Amendment 1904 #
Proposal for a regulation
Article 37 – paragraph 5
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it shall request the participants to remove and, where necessary, revise the crisis protocol, including by taking additional measures.
Amendment 1907 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. Those competent authorities shall have the same powers to carry out the tasks or supervise the sectors assigned to them as those attributed to the Digital Services Coordinator for the application and enforcement of this Regulation. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union.
Amendment 1927 #
Proposal for a regulation
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapter III, Sections 1 to 4, as well as Chapter IV of this Regulation.
Amendment 1930 #
Proposal for a regulation
Article 40 – paragraph 1 a (new)
1a. The Member State where the consumers have their habitual residence shall have jurisdiction for the purposes of Chapter III, Section 3.
Amendment 1931 #
Proposal for a regulation
Article 40 – paragraph 1 b (new)
1b. The Member State where the authority issuing the order is situated shall have jurisdiction for the purposes of Articles 8 and 9.
Amendment 1944 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point e
(e) the power to adopt interim measures to address repeated infringements of the obligations laid down in this Regulation or to avoid the risk of serious harm.
Amendment 1946 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point e a (new)
(ea) For the purposes of point (e), the powers of the Digital Services Coordinator shall include the ability to request the relevant judicial authority to:
(i) issue an order to remove content or to restrict access to an online interface, or to order the explicit display of a warning to consumers when they access an online interface;
(ii) order a provider of a hosting service to remove, disable or restrict access to an online interface;
(iii) where appropriate, order domain registries or registrars to delete a fully qualified domain name and to allow the competent authority concerned to register it, including by requesting a third party or other public authority to implement such measures; or
(iv) order other appropriate measures under the circumstances.
Amendment 1964 #
Proposal for a regulation
Article 42 a (new)
Article 42a
In accordance with the conditional exemption from liability laid down in Article 1(1)(a), Member States shall ensure that the penalty for repeatedly failing to comply with the obligations under this Regulation includes the horizontal loss of the liability exemption for the intermediary service provider.
Amendment 1968 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service, as well as other parties having a legitimate interest and meeting relevant criteria of expertise and independence from any online hosting services provider or platform, shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
Amendment 2066 #
Proposal for a regulation
Article 48 – paragraph 5
5. The Board may invite experts and observers to attend its meetings and shall cooperate with other Union bodies, offices, agencies and advisory groups, as well as with external experts as appropriate. The Board shall make the results of this cooperation publicly available.