Activities of Jean-Lin LACAPELLE related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (273)
Amendment 189 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.
Amendment 196 #
Proposal for a regulation
Recital 5
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service frequently provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of illegal content. __________________ 26Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 203 #
Proposal for a regulation
Recital 7
(7) In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide services in the Union, as evidenced by a substantial connection to the Union.
Amendment 205 #
Proposal for a regulation
Recital 8
Amendment 210 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. __________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
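Purely by way of illustration, the indicative factors listed in this recital can be pictured as a simple decision rule: establishment, a significant user base, or any targeting indicator suffices, while mere technical accessibility never does. The following Python sketch is not part of the Regulation; the field names and the any-one-factor logic are assumptions made for the example.

    from dataclasses import dataclass, field

    @dataclass
    class ProviderProfile:
        """Hypothetical record of a provider's links to a Member State."""
        has_union_establishment: bool = False
        significant_users_in: list = field(default_factory=list)  # Member States
        local_language_or_currency: bool = False
        national_top_level_domain: bool = False
        listed_in_national_app_store: bool = False
        local_advertising_or_support: bool = False
        website_merely_accessible: bool = True  # never sufficient on its own

    def has_substantial_connection(p: ProviderProfile) -> bool:
        """Sketch of the recital's test: establishment, significant number of
        users, or targeting of activities; mere accessibility never qualifies."""
        if p.has_union_establishment or p.significant_users_in:
            return True
        return any((p.local_language_or_currency,
                    p.national_top_level_domain,
                    p.listed_in_national_app_store,
                    p.local_advertising_or_support))

    # A site that is only technically reachable from the Union does not qualify.
    assert not has_substantial_connection(ProviderProfile())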
Amendment 223 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of ‘illegal content’ should be strictly defined, with the term covering any activity stipulated or punishable by the applicable law of the Member State concerned.
Amendment 256 #
Proposal for a regulation
Recital 17
Amendment 279 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
(Does not affect the English version.)
Amendment 280 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should act to remove or to disable access to the illegal content when such content is deemed to be illegal according to Union or Member State law. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness of the illegal content through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow an economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 283 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. Providers of hosting services which will be liable in this way should ensure they have the possibility of redress against the trader actually responsible if this is not provided for in the conditions of use of their services.
Amendment 286 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 290 #
Proposal for a regulation
Recital 23 a (new)
(23a) European consumers should be able to safely purchase products and services online, regardless of whether a product or service has been produced in the Union or not. Online platforms allowing distance contracts with third-country traders should establish, before approving that trader on their platform, that the third-country trader complies with the relevant Union or national law on product safety and product compliance. In addition, if the third-country trader does not provide an economic operator inside the Union liable for product safety, online platforms should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation.
Amendment 292 #
Proposal for a regulation
Recital 24
(24) The exemptions from liability established in this Regulation should not affect the possibility of injunctions of different kinds against providers of intermediary services, even where they meet the conditions set out as part of those exemptions. Such injunctions could, in particular, consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal content specified in such orders, issued in compliance with Union law, or the disabling of access to it.
(Does not affect the English version.)
Amendment 303 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
(Does not affect the English version.)
Amendment 306 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or court orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
Amendment 318 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
(Does not affect the English version.)
Amendment 326 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
(Does not affect the English version.)
Amendment 330 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
(Does not affect the English version.)
Amendment 334 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the court order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
Amendment 340 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
(Does not affect the English version.)
Amendment 348 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary clearly to establish the legal obligations which will apply to providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 350 #
Proposal for a regulation
Recital 35
Amendment 357 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 364 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. To this end, the use of algorithmic decision-making processes should be disclosed to users whenever they are employed.
Amendment 371 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40. __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 382 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, with the exception of those which play an architectural role, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
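As a rough illustration of the 'notice' and 'action' flow described in this recital, the Python sketch below models a notice that may bundle several items of allegedly illegal content and a per-item decision by the provider. All names, fields and the decision callback are invented for the example; the Regulation prescribes no data format.

    from dataclasses import dataclass
    from enum import Enum

    class Action(Enum):
        REMOVE = "remove"
        DISABLE_ACCESS = "disable_access"
        NO_ACTION = "no_action"

    @dataclass
    class Notice:
        notifier: str
        item_urls: list          # one notice may cover several specific items
        explanation: str         # why the notifier considers the items illegal

    def handle_notice(notice: Notice, assess) -> dict:
        """Return a per-item decision; `assess` stands in for the provider's
        own (human or automated) legality assessment, passed in as a callable."""
        decisions = {}
        for url in notice.item_urls:
            agrees = assess(url, notice.explanation)
            decisions[url] = Action.REMOVE if agrees else Action.NO_ACTION
        return decisions

    # Example: a provider that agrees with every substantiated notice.
    demo = Notice("user@example.org", ["https://host.example/item/1"], "counterfeit listing")
    print(handle_notice(demo, lambda url, why: bool(why)))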
Amendment 386 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
(Does not affect the English version.)
Amendment 393 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 399 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 with the exception of those which are owned or controlled by bodies established outside the European Union, unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. __________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
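A minimal sketch of the exemption logic in this amended recital, assuming the staff and turnover ceilings of Recommendation 2003/361/EC (fewer than 50 staff and at most EUR 10 million annual turnover for a small enterprise) and the 45 million recipient threshold from recital 54. The field names and the reduction to three boolean tests are illustrative simplifications, not the legal test.

    from dataclasses import dataclass

    @dataclass
    class Platform:
        staff: int
        annual_turnover_eur: float
        monthly_active_recipients_eu: int
        controlled_from_outside_eu: bool

    def exempt_from_additional_obligations(p: Platform) -> bool:
        """Recital 43 as amended: micro/small enterprises are exempt unless
        they are owned or controlled from outside the Union or their reach
        qualifies them as very large online platforms."""
        is_micro_or_small = p.staff < 50 and p.annual_turnover_eur <= 10_000_000
        is_vlop = p.monthly_active_recipients_eu >= 45_000_000
        return is_micro_or_small and not p.controlled_from_outside_eu and not is_vlop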
Amendment 407 #
Proposal for a regulation
Recital 46
Amendment 420 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 421 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly illegal notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content whenever it contravenes the law, and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions. Where the platform decides to suspend accounts which concern matters of public interest, such as those belonging to political figures or candidates for election, it can act only on the basis of a preliminary court injunction. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
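To make the safeguard proposed in this amendment concrete, here is an illustrative sketch: suspension of an ordinary account follows repeated manifest abuse, while accounts of public interest (such as political figures or electoral candidates) may only be suspended on the basis of a preliminary court injunction. The abuse threshold and all names are invented for the example; the amendment sets no numeric threshold.

    from dataclasses import dataclass

    ABUSE_THRESHOLD = 3  # hypothetical: repeated manifest abuse before suspension

    @dataclass
    class Account:
        handle: str
        manifest_abuse_count: int
        public_interest: bool           # e.g. political figure or electoral candidate
        court_injunction: bool = False  # preliminary injunction obtained?

    def may_temporarily_suspend(acc: Account) -> bool:
        if acc.manifest_abuse_count < ABUSE_THRESHOLD:
            return False
        if acc.public_interest:
            return acc.court_injunction  # only on the basis of a court injunction
        return True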
Amendment 431 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
(Does not affect the English version.)
Amendment 432 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should promptly inform the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 442 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. Online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Online platforms may also ask for support from the Digital Services Coordinator in carrying out these specific obligations. If the trader is established outside the Union and does not cooperate or does not provide sufficient information for the verification of its compliance with the relevant Union or Member State law, this trader should not be admitted to operate and sell its products on the platform. If the trader is already on the platform and does not meet the above criteria, the platform should suspend that trader's account. The trader should be granted the possibility of redress in the event of suspension of the business account. Online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. __________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
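The verification duty sketched in this recital lends itself to a simple onboarding check. The Python sketch below is illustrative only: `vies_lookup` is a placeholder for a query to the real VAT Information Exchange System, whose actual interface is not specified here, and the admission criteria are a simplification of the recital's wording.

    from dataclasses import dataclass
    from typing import Callable, List, Optional

    @dataclass
    class Trader:
        name: str
        vat_number: str
        established_in_eu: bool
        eu_responsible_operator: Optional[str]  # economic operator liable in the Union, if any
        supporting_documents: List[str]

    def admit_trader(trader: Trader, vies_lookup: Callable[[str], bool]) -> bool:
        """Reasonable-efforts check before the trader may sell on the platform.
        `vies_lookup` stands in for a real registry query."""
        if not vies_lookup(trader.vat_number):
            return False
        if not trader.established_in_eu and trader.eu_responsible_operator is None:
            # Recitals 23a and 50 as amended: no admission without a liable EU operator.
            return False
        return bool(trader.supporting_documents)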
Amendment 451 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union under the supervision of the Commission.
Amendment 458 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
(Does not affect the English version.)
Amendment 472 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10 % of the Union population, and where the online platform has an annual global turnover exceeding EUR 100 million. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
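The cumulative threshold in this amendment reduces to a simple check: roughly 45 million average monthly recipients (about 10 % of a Union population of approximately 450 million) and more than EUR 100 million in annual global turnover. A minimal sketch, with illustrative names:

    USER_THRESHOLD = 45_000_000           # roughly 10 % of the Union population
    TURNOVER_THRESHOLD_EUR = 100_000_000  # cumulative criterion added by the amendment

    def is_very_large_online_platform(avg_monthly_recipients_eu: int,
                                      annual_global_turnover_eur: float) -> bool:
        """Both criteria must be met under recital 54 as amended."""
        return (avg_monthly_recipients_eu >= USER_THRESHOLD
                and annual_global_turnover_eur > TURNOVER_THRESHOLD_EUR)

    assert is_very_large_online_platform(50_000_000, 2e9)
    assert not is_very_large_online_platform(50_000_000, 80e6)  # turnover too low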
Amendment 473 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess, under the supervision of the Commission and the European Board for Digital Services, the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
Amendment 479 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 485 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 486 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, or adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 492 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinators, the Board and the Commission without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
Amendment 497 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. This option must be easily accessible and must correspond to a predefined parameter profile.
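One way to picture the obligation in this amended recital: the recipient can inspect the main parameters of the recommender system and switch to a predefined option that is not based on profiling. The sketch below is purely illustrative; the profile names and parameters are invented and no particular implementation is prescribed.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class RecommenderProfile:
        name: str
        parameters: dict      # main parameters, disclosed to the recipient
        uses_profiling: bool

    PROFILES = {
        "personalised": RecommenderProfile(
            "personalised", {"signal": "watch history", "order": "predicted interest"}, True),
        "chronological": RecommenderProfile(  # predefined option not based on profiling
            "chronological", {"signal": "none", "order": "most recent first"}, False),
    }

    def select_profile(choice: str) -> RecommenderProfile:
        """Recipients must always have at least one option not based on profiling."""
        assert any(not p.uses_profiling for p in PROFILES.values())
        return PROFILES[choice]

    print(select_profile("chronological").parameters)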
Amendment 505 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinators of the Member States, the Board or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 512 #
Proposal for a regulation
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. In order to be properly distributed, such standards must be supported by a high quality level monitored by the public authority. They could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
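The recital mentions standardised notice submission "including through application programming interfaces". As a hypothetical illustration only (the Regulation defines no endpoint, path or field names; everything below is invented), a provider might expose something like:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class NoticeEndpoint(BaseHTTPRequestHandler):
        """Hypothetical /notices endpoint accepting a standardised JSON notice."""
        def do_POST(self):
            if self.path != "/notices":
                self.send_error(404)
                return
            body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
            notice = json.loads(body)
            # Minimal shape check: the field names are invented for the example.
            ok = {"item_url", "explanation", "notifier"} <= notice.keys()
            self.send_response(202 if ok else 400)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), NoticeEndpoint).serve_forever()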
Amendment 515 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
Amendment 518 #
Proposal for a regulation
Recital 68
Amendment 521 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content, for example sharing of images depicting child sexual abuse or terrorist content, should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
Amendment 523 #
Proposal for a regulation
Recital 69
Amendment 526 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
Amendment 531 #
Proposal for a regulation
Recital 71
Amendment 537 #
Proposal for a regulation
Recital 74
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board, the Member States and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.
Amendment 540 #
Proposal for a regulation
Recital 76
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised, without this identification serving as a presumption of recognition of establishment for tax purposes. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures it has taken in the exercise of that jurisdiction.
Amendment 547 #
Proposal for a regulation
Recital 84
Recital 84
(84) The Digital Services Coordinator should regularly publish a report on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State.
(Does not affect the English version.)
Amendment 550 #
Proposal for a regulation
Recital 85
Recital 85
(85) Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than three Member States, should be able to refer the matter to the Commission in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. The Commission, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined and reasonable time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Board’s or the Commission’s intervention under Section 3 of Chapter IV of this Regulation, where the suspected infringer is a very large online platform.
Amendment 558 #
Proposal for a regulation
Recital 89
Recital 89
Amendment 563 #
Proposal for a regulation
Recital 90
Recital 90
(90) The Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While not legally binding, the decision to deviate therefrom should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation.
Amendment 568 #
Proposal for a regulation
Recital 91
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring a careful assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 576 #
Proposal for a regulation
Recital 97
Recital 97
(97) The Board should have full decision-making powers in the investigation and enforcement procedures set out in this Regulation. The Commission, which shall provide the Board with all the technical assistance at its disposal, should have full powers of enforcement concerning the Board’s decisions. Once the Commission, on the instructions of the Board, initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request or that of the Board, in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers, including, as appropriate, those occasions when it exercises them autonomously in order to submit proposals to the Board.
Amendment 580 #
Proposal for a regulation
Recital 98
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should, on the initiation of the relevant procedure at the decision of the Board, have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties.
Amendment 583 #
Proposal for a regulation
Recital 98
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties.
Amendment 584 #
Proposal for a regulation
Recital 99
Recital 99
(99) The Commission, in order to open and conduct investigations and to monitor compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored, should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
Amendment 600 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
Article 1 – paragraph 1 – point a
(a) a framework for the possible conditional exemption from liability of providers of intermediary services;
Amendment 602 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
Article 1 – paragraph 1 – point b
(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;
Amendment 611 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 629 #
Proposal for a regulation
Article 1 – paragraph 5 – point b a (new)
Article 1 – paragraph 5 – point b a (new)
(ba) Directive (EU) 2019/882
Amendment 653 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria.
Amendment 655 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as:
Amendment 656 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
Article 2 – paragraph 1 – point d – indent 1
Amendment 658 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
Article 2 – paragraph 1 – point d – indent 1
Amendment 663 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
Article 2 – paragraph 1 – point d – indent 2
Amendment 685 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information or activity, including the sale of products or provision of services, which is not in compliance with the criminal, administrative or civil legal framework of a Member State.
Amendment 687 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information, which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
(Does not affect the English version.)
Amendment 727 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 732 #
Proposal for a regulation
Article 2 – paragraph 1 – point q
Article 2 – paragraph 1 – point q
(q) ‘terms and conditions’ means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services. Providers of online services must not impose requirements in their online conditions which exceed what is stipulated in the national regulations of the country where the service is provided.
Amendment 739 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
Article 2 – paragraph 1 – point q a (new)
(qa) "persons with disabilities" means person within the meaning of Article 3(1) of Directive(EU) 2019/882;
Amendment 749 #
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
Article 3 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, the service provider shall not in principle be liable for the information transmitted, on condition that the provider:
Amendment 751 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
Article 4 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not in principle be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that:
Amendment 755 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
Article 5 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not in principle be liable for the information stored at the request of a recipient of the service on condition that the provider:
Amendment 757 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
Article 5 – paragraph 1 – point a
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
(Does not affect the English version.)
Amendment 759 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts to remove or to disable access to the illegal content if the content or activity is to be deemed illegal under Article 2(g).
Amendment 762 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.
(Does not affect the English version.)
Amendment 767 #
Proposal for a regulation
Article 5 – paragraph 3
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. In addition, the liability exemption in paragraph 1 shall not apply in case an online platform allows consumers to conclude distance contracts with third-country traders when there is no economic operator inside the Union liable for the product safety on behalf of that trader.
Amendment 768 #
Proposal for a regulation
Article 5 – paragraph 3
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. A procedure for redress shall be provided for the online platform against the trader who is actually responsible.
Amendment 788 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Amendment 796 #
Proposal for a regulation
Article 7 – paragraph 1
Article 7 – paragraph 1
Amendment 799 #
Proposal for a regulation
Article 8 – title
Article 8 – title
Amendment 807 #
Proposal for a regulation
Article 8 – paragraph 1
Article 8 – paragraph 1
Amendment 817 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
Article 8 – paragraph 2 – point a – indent 1
Amendment 822 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
Article 8 – paragraph 2 – point a – indent 2
Amendment 826 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
Article 8 – paragraph 2 – point a – indent 3 a (new)
— precise details concerning the identity or identification of the recipients specifically concerned by the order;
Amendment 899 #
Proposal for a regulation
Article 10 – paragraph 1
Article 10 – paragraph 1
1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means and by telephone, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation.
Amendment 904 #
Proposal for a regulation
Article 10 – paragraph 2
Article 10 – paragraph 2
2. Providers of intermediary services shall make public, in a clear and user- friendly manner, the information necessary to easily identify and communicate with their single points of contact.
Amendment 923 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. The use of algorithmic decision-making processes shall be notified to users whenever they are applied. The users shall be able, where appropriate, to switch easily from interaction with the algorithmic system to human interaction. The information shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. Providers of intermediary services shall list the restrictions in relation to the use of their service for the dissemination of content deemed illegal under Union or Member State law in a clear and user-friendly manner, and differentiate the list from the general conditions for the use of their service so as to make the user aware of what is deemed illegal under the law and what is subject to the terms and conditions for the use of the service.
Amendment 930 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be specifically presented to users when they subscribe to the service, set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 953 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
Article 12 – paragraph 2 a (new)
2a. The online conditions of online service providers may not require more than what is required by the regulations of the country in which the service is provided.
Amendment 982 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;
(Does not affect the English version.)
Amendment 984 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
Amendment 991 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, measures and tools used for the purpose of content moderation, including the impact of algorithmic decision-making compared to human review, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 1031 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
Amendment 1032 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which an economic operator can establish, in a diligent manner and without discrimination, whether the notice concerns illegal content as defined in Article 2(g) of this Regulation. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 1037 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content. The possibility of identifying, on the basis of a list drawn up in agreement with the Digital Services Coordinator, the type of illegal content to which the individual or entity presumes the reported content belongs should also be foreseen;
Amendment 1040 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content;
(Does not affect the English version.)
Amendment 1048 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
(Does not affect the English version.)
Amendment 1053 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
Amendment 1106 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access and the duration;
Amendment 1110 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 1113 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
Article 15 – paragraph 2 – point d
(d) where the decision concerns content deemed to be illegal, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;
Amendment 1114 #
Proposal for a regulation
Article 15 – paragraph 2 – point e
Article 15 – paragraph 2 – point e
Amendment 1117 #
Proposal for a regulation
Article 15 – paragraph 4
Article 15 – paragraph 4
Amendment 1140 #
Proposal for a regulation
Article 16 – paragraph 1
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, with the exception of those owned or controlled by bodies outside the Union.
Amendment 1147 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1148 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective and user-friendly internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1174 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
Article 17 – paragraph 1 a (new)
1a. Internal complaint-handling services concerning recipients established in the European Union shall be established in the European Union.
Amendment 1176 #
Proposal for a regulation
Article 17 – paragraph 2
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. The complainant shall be able to enter free written explanations in addition to the pre-established complaint options.
Amendment 1184 #
Proposal for a regulation
Article 17 – paragraph 3
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1185 #
Proposal for a regulation
Article 17 – paragraph 3
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, objective and transparent manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1187 #
Proposal for a regulation
Article 17 – paragraph 4
Article 17 – paragraph 4
4. Online platforms shall promptly inform complainants of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities.
Amendment 1196 #
Proposal for a regulation
Article 17 – paragraph 5
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
Amendment 1198 #
Proposal for a regulation
Article 18
Article 18
Amendment 1217 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b
Article 18 – paragraph 2 – subparagraph 1 – point b
(b) it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute;
(Does not affect the English version.)
Amendment 1224 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
Amendment 1229 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it is capable of settling disputes in a swift, efficient, accessible for persons with disabilities, and cost-effective manner and in at least one official language of the Union and at least in the language of the recipient to whom the decision referred to in Article 17 is addressed;
Amendment 1238 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and transparent rules of procedure.
Amendment 1258 #
Proposal for a regulation
Article 19
Article 19
Amendment 1281 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner;
Amendment 1283 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
Article 19 – paragraph 2 – point c a (new)
(ca) it neither expresses nor relays a political or partisan position, nor represents an economic interest, with the exception of consumer protection and defence organisations and environmental organisations.
Amendment 1319 #
Proposal for a regulation
Article 20 – paragraph 1
Article 20 – paragraph 1
1. Online platforms shall suspend, for a specified period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content that has been duly declared illegal as defined in Article 2(g). The online platform may request support from the Digital Services Coordinator to establish the frequency for which account suspension is deemed necessary and to set the duration of the suspension.
Amendment 1326 #
Proposal for a regulation
Article 20 – paragraph 1 a (new)
Article 20 – paragraph 1 a (new)
1a. Online platforms shall not activate the notice and action mechanism described in Article 14 if the intended recipients are elected officials or candidates for election during electoral campaigns.
Amendment 1327 #
Proposal for a regulation
Article 20 – paragraph 2
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants, including trusted flaggers, that frequently submit notices or complaints that are manifestly unfounded. If individuals, entities or complainants, including trusted flaggers, continue to submit notices or complaints which are manifestly unfounded or prove to be unfounded following the imposition of a measure suspending the processing of notices and complaints, online platforms shall suspend the provision of their services to those recipients for a reasonable period of time, after having issued a prior warning.
Amendment 1335 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 1341 #
Proposal for a regulation
Article 20 – paragraph 3 – subparagraph 1 a (new)
Article 20 – paragraph 3 – subparagraph 1 a (new)
The assessment must be carried out by qualified staff provided with dedicated training on the applicable legal framework.
Amendment 1357 #
Proposal for a regulation
Article 21 – paragraph 1 a (new)
Article 21 – paragraph 1 a (new)
1a. Where an online trading platform has information which might suggest that a criminal offence of the nature of counterfeiting or fraud has taken place, is taking place or is likely to take place, it shall inform the law enforcement and judiciary services of the Member State concerned of its suspicion without delay and shall provide all the relevant information which is available. It shall also be able to expedite internal enquiries and, depending on their outcome, to withdraw the notice(s) in question. It shall transmit the details and the outcome of such an enquiry to the above-mentioned services of the Member State concerned.
Amendment 1359 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Article 21 – paragraph 2 – subparagraph 1
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform without undue delay the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol.
Amendment 1360 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Article 21 – paragraph 2 – subparagraph 1
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative and may also inform Europol.
Amendment 1374 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
Amendment 1384 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Article 22 – paragraph 1 – point c
(c) the bank account details of the trader, where the trader is a natural person;
Amendment 1394 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader that products or services provided comply with the relevant Union or national law on product safety and product compliance.
Amendment 1401 #
Proposal for a regulation
Article 22 – paragraph 2
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, assess, with the support of the Digital Services Coordinator if needed, whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable and official sources. Online platforms allowing distance contracts with third-country traders shall establish that the third-country trader complies with the relevant Union or national law on product safety and product compliance before giving them access to its services offered in the Union and, where appropriate, with the support of the Digital Services Coordinator. The Digital Services Coordinator may request support from market surveillance or customs authorities to assess the information provided by the trader.
Amendment 1430 #
Proposal for a regulation
Article 22 – paragraph 4
Article 22 – paragraph 4
4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned, asking the trader to notify any changes and confirm the information held by the online platform once a year. After the contractual relationship has ended, the online platform shall delete the information.
Amendment 1433 #
Proposal for a regulation
Article 22 – paragraph 4
Article 22 – paragraph 4
4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information, where necessary following the expiry of its legal retention period.
Amendment 1469 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 1474 #
Proposal for a regulation
Article 23 – paragraph 2
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every twelve months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past twelve months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
Amendment 1477 #
Proposal for a regulation
Article 23 – paragraph 3
Article 23 – paragraph 3
3. Online platforms shall communicate to the Digital Services Coordinators of the Member States, upon their request, the information referred to in paragraph 2, updated to the moment of such request. The Digital Services Coordinators may require the online platform to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.
Amendment 1491 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed, as well as their nationality;
Amendment 1499 #
Proposal for a regulation
Article 24 – paragraph 1 – subparagraph 1 a (new)
Article 24 – paragraph 1 – subparagraph 1 a (new)
Special attention shall be given to recipients of the service who are minors. When advertising is addressed to minors, online platforms shall indicate in a clear, easy and unambiguous manner that such advertising targets this group of recipients.
Amendment 1530 #
Proposal for a regulation
Article 25 – paragraph 1
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, and whose annual global turnover is equal to or greater than EUR 100 million.
Amendment 1536 #
Proposal for a regulation
Article 25 – paragraph 3
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features.
Amendment 1546 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks, including where they result from a voluntary action taken by the platform on the basis of its technological, social or economic model:
Amendment 1559 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content through their services;
(Does not affect the English version.)
Amendment 1583 #
Proposal for a regulation
Article 26 – paragraph 1 – subparagraph 1 a (new)
Article 26 – paragraph 1 – subparagraph 1 a (new)
The Board shall approve the report.
Amendment 1589 #
Proposal for a regulation
Article 26 – paragraph 2
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 1592 #
Proposal for a regulation
Article 26 – paragraph 2
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall also take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 1603 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall, in collaboration with the Commission, put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 1610 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;
Amendment 1611 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) checking content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;
Amendment 1617 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
Article 27 – paragraph 1 – point d
Amendment 1620 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Article 27 – paragraph 1 – point e
Amendment 1633 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following:
Amendment 1662 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Article 28 – paragraph 1 – point b
(b) any commitments undertaken pursuant to the codes of conduct for online advertising referred to in Article 36.
Amendment 1668 #
Proposal for a regulation
Article 28 – paragraph 2 – point b
Article 28 – paragraph 2 – point b
(b) have proven expertise in the area of risk management, technical competence and capabilities certified by a qualified and accredited certification body;
Amendment 1670 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
Article 28 – paragraph 2 – point c
(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
Amendment 1723 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
Article 30 – paragraph 2 – point b
(b) the identity and nationality of the natural or legal person on whose behalf the advertisement is displayed;
Amendment 1751 #
Proposal for a regulation
Article 31 – paragraph 1
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment, the Digital Services Coordinator of destination or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. Those Digital Services Coordinators and the Commission shall only use that data for those purposes.
Amendment 1756 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment, the Digital Services Coordinator of destination or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
Amendment 1766 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted by the Digital Services Coordinators, researchers shall be affiliated with European academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 1772 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1776 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment, the Digital Services Coordinator of destination or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for one of the following two reasons:
Amendment 1782 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
Article 31 – paragraph 6 – point b
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of recipients’ confidential information, including trade secrets.
Amendment 1785 #
Proposal for a regulation
Article 31 – paragraph 7 – subparagraph 2
Article 31 – paragraph 7 – subparagraph 2
The Digital Services Coordinator of establishment, the Digital Services Coordinator of destination or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
Amendment 1792 #
Proposal for a regulation
Article 32 – paragraph 3 – point a
Article 32 – paragraph 3 – point a
(a) cooperating with the Digital Services Coordinator of establishment, the Digital Services Coordinator of destination and the Commission for the purpose of this Regulation;
Amendment 1794 #
Proposal for a regulation
Article 32 – paragraph 5
Article 32 – paragraph 5
5. Very large online platforms shall communicate the name and contact details of the compliance officer to the Digital Services Coordinator of establishment and the Commission.
Amendment 1814 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
Article 34 – paragraph 1 – introductory part
1. The Board, in cooperation with the Commission, shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies at least for the following:
Amendment 1818 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
Article 34 – paragraph 1 – point b
Amendment 1836 #
Proposal for a regulation
Article 34 – paragraph 1 – subparagraph 1 a (new)
Article 34 – paragraph 1 – subparagraph 1 a (new)
Amendment 1837 #
Proposal for a regulation
Article 34 – paragraph 2
Article 34 – paragraph 2
2. The Board, in cooperation with the Commission, shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question.
Amendment 1843 #
Proposal for a regulation
Article 35
Article 35
Amendment 1852 #
Proposal for a regulation
Article 35 – paragraph 2
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission, in agreement with the Board, may invite the online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1895 #
Proposal for a regulation
Article 37
Article 37
Amendment 1916 #
Proposal for a regulation
Article 38 – paragraph 3 a (new)
Article 38 – paragraph 3 a (new)
3a. Member States shall ensure that their Digital Services Coordinators are informed by the relevant national, local and regional authorities on the diversity of platform sectors and issues covered by this Regulation;
Amendment 1922 #
Proposal for a regulation
Article 39 – paragraph 2
Article 39 – paragraph 2
Amendment 1923 #
Proposal for a regulation
Article 39 – paragraph 3
Article 39 – paragraph 3
Amendment 1925 #
Proposal for a regulation
Article 39 – paragraph 3
Article 39 – paragraph 3
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law.
Amendment 1929 #
Proposal for a regulation
Article 40 – paragraph 1 a (new)
Article 40 – paragraph 1 a (new)
1a. The Member State in which an event calling the service provider into question takes place, or in which the natural or legal person who is the recipient of the service resides, shall also have jurisdiction for the purposes of Chapters III and IV of this Regulation.
Amendment 1932 #
Proposal for a regulation
Article 40 – paragraph 3
Article 40 – paragraph 3
3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a number of Member States decide to exercise jurisdiction with regard to the same service provider, they shall coordinate their actions, where necessary via the Board, ensuring that the principle of ne bis in idem is respected.
Amendment 1934 #
Proposal for a regulation
Article 40 – paragraph 3
Article 40 – paragraph 3
3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States to ensure that the principle of ne bis in idem is respected.
Amendment 1937 #
Proposal for a regulation
Article 40 – paragraph 4
Article 40 – paragraph 4
Amendment 1950 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – point a
Article 41 – paragraph 3 – subparagraph 1 – point a
(a) require the management body of the provider, within a reasonable time period, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken within a specific period;
Amendment 1952 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – point b
Article 41 – paragraph 3 – subparagraph 1 – point b
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
Amendment 1956 #
Proposal for a regulation
Article 42 – paragraph 2
Article 42 – paragraph 2
2. Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission and the Board of those rules and of those measures and shall notify them, without delay, of any subsequent amendments affecting them.
Amendment 1958 #
Proposal for a regulation
Article 42 – paragraph 3
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or the global annual turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, for failure to reply to or rectify incorrect, incomplete or misleading information, or for failure to submit to an on-site inspection shall not exceed 1 % of the annual income or the global annual turnover of the provider concerned.
Amendment 1961 #
Proposal for a regulation
Article 42 – paragraph 4
Article 42 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily global turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
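Paragraphs 3 and 4 set arithmetic ceilings rather than fixed amounts. A minimal sketch of that arithmetic, assuming a hypothetical provider and a 365-day year; the 6 %, 1 % and 5 % rates are the only figures taken from the text, and every name and number below is illustrative:

```python
# Illustrative only: the 6 %, 1 % and 5 % ceilings come from Article 42(3)
# and (4) as amended; the figures and function names are hypothetical.

def max_fine(global_annual_turnover: float, info_offence: bool = False) -> float:
    """Ceiling for a fine under Article 42(3): 6 % of global annual turnover,
    or 1 % for supplying incorrect information or refusing an inspection."""
    rate = 0.01 if info_offence else 0.06
    return rate * global_annual_turnover

def max_periodic_penalty(global_annual_turnover: float, days: int) -> float:
    """Ceiling for periodic penalty payments under Article 42(4): 5 % of the
    average daily global turnover of the preceding financial year, per day."""
    average_daily_turnover = global_annual_turnover / 365  # assumption: 365-day year
    return 0.05 * average_daily_turnover * days

turnover = 2_000_000_000  # hypothetical provider: EUR 2 bn global annual turnover
print(f"Art. 42(3) general ceiling: EUR {max_fine(turnover):,.0f}")        # 120,000,000
print(f"Art. 42(3) info ceiling:    EUR {max_fine(turnover, True):,.0f}")  # 20,000,000
print(f"Art. 42(4), 30 days:        EUR {max_periodic_penalty(turnover, 30):,.0f}")
```

The text leaves open how the average daily global turnover is derived; dividing the annual figure by 365 is an assumption made here purely for illustration.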
Amendment 1966 #
Proposal for a regulation
Article 43 – paragraph 1
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority, pursuant to Article 40(2).
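Article 43 as amended amounts to a two-branch dispatch rule: hand the complaint to another competent national authority where one is responsible, otherwise forward it to the Digital Services Coordinator of establishment, otherwise handle it locally. A minimal sketch of that routing, with hypothetical type and field names not drawn from the Regulation:

```python
# Hypothetical model of the Article 43 complaint routing described above.
from dataclasses import dataclass

@dataclass
class Complaint:
    recipient_member_state: str       # where the complainant resides or is established
    provider_establishment: str       # Member State of the provider's establishment
    other_authority_competent: bool   # falls under another national authority?

def route(complaint: Complaint, local_dsc: str) -> str:
    """Return the body that should handle the complaint next."""
    if complaint.other_authority_competent:
        # transmit to the competent authority in the same Member State
        return f"competent authority of {local_dsc}"
    if complaint.provider_establishment != local_dsc:
        # transmit to the Digital Services Coordinator of establishment
        return f"DSC of establishment ({complaint.provider_establishment})"
    return f"DSC of {local_dsc} (handles the complaint itself)"

print(route(Complaint("FR", "IE", False), local_dsc="FR"))  # DSC of establishment (IE)
```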
Amendment 1977 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
(Does not affect the English version.)
Amendment 2001 #
Proposal for a regulation
Article 45 – paragraph 5
Article 45 – paragraph 5
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Board, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4. The Board shall resend the request set out in paragraph 1 to the Digital Services Coordinator of establishment. The Digital Services Coordinator of establishment shall assess the request and transmit its reply in accordance with the conditions set out in paragraphs 3 and 4. If the Board, having drawn up a request involving at least three Member States or a request pursuant to the first subparagraph, has not received a reply by the deadline set out in paragraph 4, or if it does not agree with the assessment of the Digital Services Coordinator of establishment, it shall adopt a decision and shall transmit instructions to the Commission concerning the measures to be taken on the basis of that decision.
Amendment 2005 #
Proposal for a regulation
Article 45 – paragraph 6
Article 45 – paragraph 6
6. The Commission shall carry out the instructions received pursuant to paragraph 5 without delay.
Amendment 2009 #
Proposal for a regulation
Article 45 – paragraph 7
Article 45 – paragraph 7
Amendment 2024 #
Proposal for a regulation
Article 46 – paragraph 2
Article 46 – paragraph 2
2. Where a Digital Services Coordinator of establishment has reasons to suspect that a very large online platform infringed this Regulation, it may request the Board to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting an intervention, on which the Board shall act by vote, transmitting the conclusions to the Commission.
Amendment 2029 #
Proposal for a regulation
Article 47 – paragraph 1
Article 47 – paragraph 1
1. A body comprising the Digital Services Coordinators, named ‘European Board for Digital Services’ (the ‘Board’), is established.
Amendment 2033 #
Proposal for a regulation
Article 47 – paragraph 2 – introductory part
Article 47 – paragraph 2 – introductory part
2. The Board shall work together with the Commission to ensure the monitoring of providers of intermediary services and the application of this Regulation.
Amendment 2036 #
Proposal for a regulation
Article 47 – paragraph 2 – point a
Article 47 – paragraph 2 – point a
Amendment 2040 #
Proposal for a regulation
Article 47 – paragraph 2 – point b
Article 47 – paragraph 2 – point b
Amendment 2043 #
Proposal for a regulation
Article 47 – paragraph 2 – point c
Article 47 – paragraph 2 – point c
Amendment 2046 #
Proposal for a regulation
Article 48 – title
Article 48 – title
Amendment 2051 #
Proposal for a regulation
Article 48 – paragraph 1
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. A meeting is deemed valid when at least two thirds of the eligible members are present.
Amendment 2052 #
Proposal for a regulation
Article 48 – paragraph 1
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them.
Amendment 2057 #
Proposal for a regulation
Article 48 – paragraph 2 – subparagraph 2
Article 48 – paragraph 2 – subparagraph 2
The Board shall adopt all its acts by qualified majority.
Amendment 2060 #
Proposal for a regulation
Article 48 – paragraph 3
Article 48 – paragraph 3
3. The Board shall be chaired by the Commission, which shall provide the secretariat and administrative and analytical support for the Board’s activities under this Regulation. The Board member representing the Member State which holds the Council Presidency shall convene the meetings and prepare the agenda in coordination with the Commission.
Amendment 2061 #
Proposal for a regulation
Article 48 – paragraph 3 a (new)
Article 48 – paragraph 3 a (new)
3a. The Board shall adopt its rules of procedure.
Amendment 2063 #
Proposal for a regulation
Article 48 – paragraph 4
Article 48 – paragraph 4
Amendment 2068 #
Proposal for a regulation
Article 48 – paragraph 6
Article 48 – paragraph 6
Amendment 2085 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
Article 49 – paragraph 1 – point d
(d) decide on the measures to be taken under Articles 51, 55, 56, 57, 58, 59 and 60 of this Regulation.
Amendment 2091 #
Proposal for a regulation
Article 49 – paragraph 2
Article 49 – paragraph 2
Amendment 2097 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
Article 50 – paragraph 1 – subparagraph 2
The Digital Services Coordinator of the Member State concerned acting on its own initiative, the Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
Amendment 2101 #
Proposal for a regulation
Article 50 – paragraph 2
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Digital Services Coordinator of the Member State concerned, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
Amendment 2103 #
Proposal for a regulation
Article 50 – paragraph 2
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may recommend, where appropriate, participation in a code of conduct as provided for in Article 35.
Amendment 2106 #
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 2
Article 50 – paragraph 3 – subparagraph 2
Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request that the Board reconsider the matter. The Board shall adopt a decision, in the light of which it may or may not exercise the powers set out in Articles 51, 55, 56, 57, 58, 59 and 60.
Amendment 2108 #
Proposal for a regulation
Article 50 – paragraph 4
Article 50 – paragraph 4
Amendment 2116 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s instructions or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that is suspected of infringing one of the provisions of this Regulation.
Amendment 2121 #
Proposal for a regulation
Article 51 – paragraph 1 – point a
Article 51 – paragraph 1 – point a
Amendment 2122 #
Proposal for a regulation
Article 51 – paragraph 1 – point b
Article 51 – paragraph 1 – point b
Amendment 2125 #
Proposal for a regulation
Article 51 – paragraph 1 – point c
Article 51 – paragraph 1 – point c
Amendment 2127 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Article 51 – paragraph 2 – subparagraph 1
Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. If the Commission decides not to initiate proceedings pursuant to paragraph 1, it shall inform the Board in writing of its reasons.
Amendment 2129 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Article 51 – paragraph 2 – subparagraph 1
Where the Commission initiates or decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 2133 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 2
Article 51 – paragraph 2 – subparagraph 2
Amendment 2136 #
Proposal for a regulation
Article 51 – paragraph 3 – introductory part
Article 51 – paragraph 3 – introductory part
3. The Digital Services Coordinator referred to in Articles 45(7), 46(2) and 50(1), as applicable, shall, without undue delay upon being informed, transmit to the Commission:
Amendment 2143 #
Proposal for a regulation
Article 52 – paragraph 1
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
Amendment 2147 #
Proposal for a regulation
Article 52 – paragraph 3
Article 52 – paragraph 3
3. Where the Commission requires the very large online platform concerned or other person referred to in Article 52(1) to supply information by decision, it shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which it is to be provided. It shall also indicate the penalties provided for in Article 59 and indicate or impose the periodic penalty payments provided for in Article 60. It shall further indicate the right to have any decision taken under this Article that adversely affects it reviewed by the Court of Justice of the European Union.
Amendment 2165 #
Proposal for a regulation
Article 55 – paragraph 1
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, and after consulting the Board, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
Amendment 2166 #
Proposal for a regulation
Article 55 – paragraph 1
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission and the Board may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
Amendment 2171 #
Proposal for a regulation
Article 56 – paragraph 1
Article 56 – paragraph 1
1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may, by decision and after consulting the Board, make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action.
Amendment 2172 #
Proposal for a regulation
Article 56 – paragraph 1
Article 56 – paragraph 1
1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may submit to the Board a decision making those commitments binding on the very large online platform concerned and declaring that there are no further grounds for action.
Amendment 2174 #
Proposal for a regulation
Article 56 – paragraph 2 – introductory part
Article 56 – paragraph 2 – introductory part
2. The Commission may, upon instruction by the Board or on its own initiative, reopen the proceedings:
Amendment 2176 #
Proposal for a regulation
Article 56 – paragraph 3
Article 56 – paragraph 3
3. Where the Commission considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall reject those commitments in a reasoned decision, in agreement with the Board, when concluding the proceedings.
Amendment 2177 #
Proposal for a regulation
Article 56 – paragraph 3
Article 56 – paragraph 3
3. Where the Commission considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall submit to the Board a decision to reject those commitments when concluding the proceedings.
Amendment 2181 #
Proposal for a regulation
Article 57 – paragraph 1
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission and the Board may also order that platform to provide access to, and explanations relating to, its databases and algorithms.
Amendment 2187 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
Article 58 – paragraph 1 – introductory part
1. The Commission shall adopt a non-compliance decision, after consulting the Board, where it finds that the very large online platform concerned does not comply with one or more of the following:
Amendment 2190 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
Article 58 – paragraph 1 – introductory part
1. The Board shall adopt a non-compliance decision where it finds that the very large online platform concerned does not comply with one or more of the following:
Amendment 2194 #
Proposal for a regulation
Article 58 – paragraph 2
Article 58 – paragraph 2
2. Before adopting the decision pursuant to paragraph 1, the Board shall communicate its preliminary findings to the very large online platform concerned. In the preliminary findings, the Board shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings.
Amendment 2197 #
Proposal for a regulation
Article 58 – paragraph 3
Article 58 – paragraph 3
3. In the decision adopted pursuant to paragraph 1 the Board shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable time period and to provide information on the measures that that platform intends to take to comply with the decision.
Amendment 2199 #
Proposal for a regulation
Article 58 – paragraph 4
Article 58 – paragraph 4
4. The very large online platform concerned shall provide the Board and the Commission with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation.
Amendment 2202 #
Proposal for a regulation
Article 58 – paragraph 5
Article 58 – paragraph 5
5. Where the Board, on its own initiative or on a proposal by the Commission, deems that the conditions of paragraph 1 are not met, it shall close the investigation by a decision.
Amendment 2203 #
Proposal for a regulation
Article 58 – paragraph 5
Article 58 – paragraph 5
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision approved by the Board.
Amendment 2210 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the Board may impose on the very large online platform concerned fines not exceeding 6 % of its global turnover in the preceding financial year where it finds that that platform, intentionally or negligently:
Amendment 2215 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
Article 59 – paragraph 2 – introductory part
2. The Board may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1 % of the global turnover in the preceding financial year, where they intentionally or negligently:
Amendment 2219 #
Proposal for a regulation
Article 59 – paragraph 3
Article 59 – paragraph 3
3. Before adopting the decision pursuant to paragraph 2, the Commission shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1) and to the Board.
Amendment 2222 #
Proposal for a regulation
Article 59 – paragraph 3
Article 59 – paragraph 3
3. Before adopting the decision pursuant to paragraph 2, the Board shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1).
Amendment 2224 #
Proposal for a regulation
Article 59 – paragraph 4
Article 59 – paragraph 4
4. In fixing the amount of the fine, the Board shall have regard to the nature, gravity, duration and recurrence of the infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
Amendment 2227 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
Article 60 – paragraph 1 – introductory part
1. The Board may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily global turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
Amendment 2229 #
Proposal for a regulation
Article 60 – paragraph 2
Article 60 – paragraph 2
Amendment 2233 #
Proposal for a regulation
Article 61 – paragraph 3 – point a
Article 61 – paragraph 3 – point a
(a) requests for information by the Commission, the Board or by a Digital Services Coordinator;
Amendment 2237 #
Proposal for a regulation
Article 61 – paragraph 4
Article 61 – paragraph 4
4. Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Commission having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period is suspended pursuant to paragraph 5.
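Paragraph 4 packs three rules into one sentence: each interruption restarts the limitation clock, an absolute ceiling of twice the limitation period applies regardless of interruptions, and any suspension under paragraph 5 extends that ceiling. A small sketch of the date arithmetic, assuming the five-year period of Article 61(1), approximating years as 365 days, and using purely illustrative dates:

```python
# Illustrative reading of Article 61(4); the five-year period is taken from
# Article 61(1), years are approximated as 365 days, all dates are hypothetical.
from datetime import date, timedelta
from typing import Optional

LIMITATION_YEARS = 5

def latest_imposition_date(infringement_end: date,
                           last_interruption: Optional[date],
                           suspension_days: int = 0) -> date:
    """Latest day on which a fine or periodic penalty may still be imposed."""
    # each interruption starts time running afresh...
    restart = last_interruption or infringement_end
    running = restart + timedelta(days=365 * LIMITATION_YEARS)
    # ...but never beyond twice the limitation period, extended by any suspension
    ceiling = (infringement_end
               + timedelta(days=365 * LIMITATION_YEARS * 2)
               + timedelta(days=suspension_days))
    return min(running, ceiling)

# A late interruption cannot push enforcement past the twice-the-period ceiling:
print(latest_imposition_date(date(2023, 1, 1), date(2029, 6, 1)))  # ceiling applies
```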
Amendment 2240 #
Proposal for a regulation
Article 61 – paragraph 5
Article 61 – paragraph 5
5. The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Board is the subject of proceedings pending before the Court of Justice of the European Union.
Amendment 2241 #
Proposal for a regulation
Article 62
Article 62
Amendment 2242 #
Proposal for a regulation
Article 62 – title
Article 62 – title
Amendment 2243 #
Proposal for a regulation
Article 62 – paragraph 1
Article 62 – paragraph 1
1. Penalties decided upon shall be enforceable immediately and enforced without delay by the Commission.
Amendment 2246 #
Proposal for a regulation
Article 63 – paragraph 1 – introductory part
Article 63 – paragraph 1 – introductory part
1. Before adopting a decision pursuant to Articles 58(1), 59 or 60, the Board shall give the very large online platform concerned or other person referred to in Article 52(1) the opportunity of being heard on:
Amendment 2249 #
Proposal for a regulation
Article 63 – paragraph 1 – point a
Article 63 – paragraph 1 – point a
(a) preliminary findings of the Board, including any matter to which the Board has taken objection; and
Amendment 2251 #
Proposal for a regulation
Article 63 – paragraph 1 – point b
Article 63 – paragraph 1 – point b
(b) measures that the Board may intend to take in view of the preliminary findings referred to in point (a).
Amendment 2253 #
Proposal for a regulation
Article 63 – paragraph 2
Article 63 – paragraph 2
2. The very large online platform concerned or other person referred to in Article 52(1) may submit their observations on the Board’s preliminary findings within a reasonable time period set by the Board in its preliminary findings, which may not be less than 14 days.
Amendment 2255 #
Proposal for a regulation
Article 63 – paragraph 3
Article 63 – paragraph 3
3. The Board shall base its decisions only on objections on which the parties concerned have been able to comment.
Amendment 2256 #
Proposal for a regulation
Article 63 – paragraph 4
Article 63 – paragraph 4
4. The rights of defence of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the Board’s and the Commission's file under the terms of a negotiated disclosure, subject to the legitimate interest of the very large online platform concerned or other person referred to in Article 52(1) in the protection of their business secrets. The right of access to the file shall not extend to confidential information and internal documents of the Commission, the Board or Member States’ authorities. In particular, the right of access shall not extend to correspondence between the Commission, the Board and those authorities. Nothing in this paragraph shall prevent the Commission or the Board from disclosing and using information necessary to prove an infringement.
Amendment 2260 #
Proposal for a regulation
Article 64 – paragraph 1
Article 64 – paragraph 1
1. The Board shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed.
Amendment 2262 #
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 1
Article 65 – paragraph 1 – subparagraph 1
Where all powers pursuant to this Article to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Commission or the Board may request the Digital Services Coordinator of establishment of the very large online platform concerned to act pursuant to Article 41(3).
Amendment 2263 #
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 2
Article 65 – paragraph 1 – subparagraph 2
Prior to making such request to the Digital Services Coordinator, the Commission or the Board shall invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures they intend to request and identifying the intended addressee or addressees thereof.
Amendment 2265 #
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 2
Article 65 – paragraph 1 – subparagraph 2
Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than 14 days, describing the measures it intends to request and identifying the intended addressee or addressees thereof.
Amendment 2266 #
Proposal for a regulation
Article 65 – paragraph 2 – subparagraph 1
Article 65 – paragraph 2 – subparagraph 1
Where the coherent application of this Regulation so requires, the Commission or the Board, acting on its own initiative, may submit written observations to the competent judicial authority referred to in Article 41(3). With the permission of the judicial authority in question, they may also make oral observations.
Amendment 2268 #
Proposal for a regulation
Article 65 – paragraph 2 – subparagraph 2
Article 65 – paragraph 2 – subparagraph 2
For the purpose of the preparation of their observations only, the Commission or the Board may request that judicial authority to transmit or ensure the transmission to them of any documents necessary for the assessment of the case.
Amendment 2280 #
Proposal for a regulation
Article 68 – paragraph 1 – introductory part
Article 68 – paragraph 1 – introductory part
Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council52, recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 17, 18 and 19 on their behalf, provided the body, organisation or association meets all of the following conditions: __________________ 52 [Reference]
Amendment 2285 #
Proposal for a regulation
Article 69 – paragraph 4
Article 69 – paragraph 4
4. As soon as it adopts a delegated act, the Commission shall notify it simultaneously to the European Parliament, the Council, the Board and the Digital Services Coordinators.
Amendment 2287 #
Proposal for a regulation
Article 70 – paragraph 1
Article 70 – paragraph 1
1. The Digital Services Committee shall be assisted by the Commission in all its work and prerogatives.
Amendment 2288 #
Proposal for a regulation
Article 70 – paragraph 2
Article 70 – paragraph 2