
Activities of Emmanuel MAUREL related to 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/09/30
Committee: JURI
Dossiers: 2020/0361(COD)
Documents: PDF(557 KB) DOC(337 KB)
Authors: Geoffroy DIDIER (MEP ID 190774)

Amendments (166)

Amendment 93 #
Proposal for a regulation
Recital 2 a (new)
(2a) whereas there exists a multilateral agreement entitled 'Information and Democracy Partnership', signed by 21 Member States.
2021/07/20
Committee: JURI
Amendment 94 #
Proposal for a regulation
Recital 2 b (new)
(2b) whereas multilateral agreements can provide common solutions for issues covered by this Regulation. Encourages regulation of the public information and communication area by establishing democratic guarantees for the digital space which are based on the liability of platforms and guarantees of the reliability of the information.
2021/07/20
Committee: JURI
Amendment 95 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information. The fundamental rights shall include the right to freedom of expression and information, the right to respect for private and family life, the right to the protection of personal data, the right to conduct a business, the right to human dignity, children's rights, the right to protection of intellectual property, and the right to non-discrimination for persons affected by illegal content.
2021/07/20
Committee: JURI
Amendment 98 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, ensuring enhanced consumer protection and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated, and citizens' fundamental rights would also be respected.
2021/07/20
Committee: JURI
Amendment 103 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) No 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2021/07/20
Committee: JURI
Amendment 109 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, such as Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended28, and Regulation (EU) …/... of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts. This Regulation should not prejudice the freedom of the Member States to regulate issues on which those other acts leave Member States the possibility of adopting certain measures at national level. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/20
Committee: JURI
Amendment 110 #
Proposal for a regulation
Recital 9 a (new)
(9a) This Regulation should have no impact on Member States' jurisdiction in the area of culture, nor should it prejudice national measures to guarantee and promote freedom of expression and information, media freedom and pluralism, and cultural diversity.
2021/07/20
Committee: JURI
Amendment 111 #
Proposal for a regulation
Recital 10
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in line with the quality criteria set out in Directive 2013/11/EU of the European Parliament and of the Council33a which deals with alternative dispute resolution for consumer disputes, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council38. The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law and national rules on working conditions, collective agreements and social security systems. _________________ 30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1). 31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57). 32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), (OJ L 201, 31.7.2002, p. 37). 33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC. 33a Directive 2013/11/EU on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC. 34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’). 35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. 36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.
37Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules. 38Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
2021/07/20
Committee: JURI
Amendment 125 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. That concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material, the provision of illegal services such as hosting services on short-term accommodation rental platforms which do not conform to Union or national law, or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with the Charter.
2021/07/20
Committee: JURI
Amendment 130 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as search engines, social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/20
Committee: JURI
Amendment 135 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. As a result, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision on who should be granted access. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. File-sharing services and other cloud services fall within the scope of this Regulation, to the extent that such services are used to make the stored information available to the public at the direct request of the content provider.
2021/07/20
Committee: JURI
Amendment 138 #
Proposal for a regulation
Recital 15 a (new)
(15a) The online activities of a person allow for deep insights into their personality as well as their past and future behaviour, making it possible to manipulate them. The high sensitivity of such information and its potential for abuse requires special protection. The general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy. Users should therefore have a right not to be subject to pervasive tracking when using information society services. To this end, the processing of personal data concerning the use of services should be limited to the extent strictly necessary to provide the service and to bill the users. Processing personal data for displaying advertisements must be prohibited. Following the jurisprudence on communications metadata, providers should not be required to indiscriminately retain personal data concerning the use of the service by all recipients. Applying effective end-to-end encryption to data is essential for trust in and security on the internet and effectively prevents unauthorised third-party access. The fact that encryption technology is abused by some for illegal purposes does not justify generally weakening effective end-to-end encryption.
2021/07/20
Committee: JURI
Amendment 142 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information or to optimise the way it is presented or to prioritise it, whether or not this is done in an automated way. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
2021/07/20
Committee: JURI
Amendment 145 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services, the main purpose of which is to conduct or facilitate illegal activities, does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
2021/07/20
Committee: JURI
Amendment 148 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of all relevant rights, including freedom of expression, the right to intellectual property and the freedom to pursue a commercial activity. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/07/20
Committee: JURI
Amendment 152 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms and other providers of services such as marketplaces that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/07/20
Committee: JURI
Amendment 154 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.deleted
2021/07/20
Committee: JURI
Amendment 160 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. In many cases, providers of intermediary services are best placed to solve the problem of illegal content and activities by removing or blocking access to such content, particularly at the request of third parties affected by the illegal content transmitted or stored online. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Parties with the technical and operational capacity to take action against illegal content must therefore ensure that third parties can identify them easily and contact them in order to combat illegal content.
2021/07/20
Committee: JURI
Amendment 164 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case or with regard to content which is identical or equivalent to content which has previously been withdrawn because it was illegal and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as impeding providers from taking proactive measures to identify and remove illegal content and to prevent its reappearance. This Regulation is without prejudice to the Member States requiring providers of services which host the information to take proactive measures to apply the obligations set out in national law, in order to prevent illegal activity.
2021/07/20
Committee: JURI
Amendment 187 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
2021/07/20
Committee: JURI
Amendment 195 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40. All decisions, rules and penalties concerning content moderation must be clear, precise and predictable for users. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/20
Committee: JURI
Amendment 200 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Online platforms must prevent the reappearance of content already identified as illegal and which was withdrawn on the basis of a prior notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/20
Committee: JURI
Amendment 207 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove, restrict or disable proposals by recommender systems of information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider must do all in its power to prevent the reappearance of the notified illegal information. This must be done particularly in the case of providers hosting large amounts of illegal content. The provider shall inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. Proposals by recommender systems can be restricted by means, for example, of shadow-banning content.
2021/07/20
Committee: JURI
Amendment 217 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission41, unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro, small and medium-sized enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/20
Committee: JURI
Amendment 228 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. They may be public bodies, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’); they can also be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry, legal persons and right-holder organisations could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council43. _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, (OJ L 135, 24.5.2016, p. 53).
2021/07/20
Committee: JURI
Amendment 230 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by repeatedly providing illegal content, facilitating the repeated uploading of illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal, respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend or terminate their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/20
Committee: JURI
Amendment 251 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/20
Committee: JURI
Amendment 254 #
Proposal for a regulation
Recital 54
(54) Large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
2021/07/20
Committee: JURI
Amendment 255 #
Proposal for a regulation
Recital 56
(56) Large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/07/20
Committee: JURI
Amendment 257 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous and counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through advertising, recommender systems or through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to personal data protection, the right to non-discrimination, the rights of the child and the right to respect for intellectual property. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech, circumventing applicable laws or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/20
Committee: JURI
Amendment 264 #
Proposal for a regulation
Recital 58
(58) Large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. They should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They should also include corrective measures, such as discontinuing advertising revenue, restricting or prohibiting advertising for specific content aimed at minors, or improving the visibility of authoritative information sources. Large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 268 #
Proposal for a regulation
Recital 59
(59) Large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
2021/07/20
Committee: JURI
Amendment 270 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by researchers vetted by the competent authorities. With the exception of the authorities or unless otherwise specified by the applicable law, auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Very large online platforms should not, however, be able to use the confidentiality of trade secrets as reasons to refuse access to relevant information needed by auditors to perform their tasks. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
2021/07/19
Committee: JURI
Amendment 271 #
Proposal for a regulation
Recital 61
(61) The audit report should be independent and substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation and other applicable laws. The report should be transmitted to the relevant Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
2021/07/19
Committee: JURI
Amendment 276 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
2021/07/19
Committee: JURI
Amendment 281 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/07/19
Committee: JURI
Amendment 285 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data or algorithms. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/19
Committee: JURI
Amendment 288 #
Proposal for a regulation
Recital 65
(65) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, very large online platforms should appoint compliance officers, which should have the necessary qualifications to operationalise measures and monitor the compliance with this Regulation within the platform’s organisation. Large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation.
2021/07/19
Committee: JURI
Amendment 291 #
Proposal for a regulation
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
2021/07/19
Committee: JURI
Amendment 293 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.deleted
2021/07/19
Committee: JURI
Amendment 297 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.deleted
2021/07/19
Committee: JURI
Amendment 300 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self- regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.deleted
2021/07/19
Committee: JURI
Amendment 304 #
Proposal for a regulation
Recital 70
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conducts should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.deleted
2021/07/19
Committee: JURI
Amendment 306 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross- border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.deleted
2021/07/19
Committee: JURI
Amendment 310 #
Proposal for a regulation
Recital 73
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities, particularly the independent national media regulation authorities, in the supervision and enforcement at Union level.
2021/07/19
Committee: JURI
Amendment 311 #
Proposal for a regulation
Recital 74
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including authorities responsible for consumer protection, market surveillance, data protection, law enforcement authorities or crisis management authorities, where appropriate.
2021/07/19
Committee: JURI
Amendment 312 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. With the exception of complaints and legal actions involving consumers or launched by consumers or independent organisations representing consumers, or by companies or creators seeking respect for intellectual property, a provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office in the Union within which the principal financial functions and operational control of compliance with this Regulation are exercised. The main establishment of a provider of intermediary services should be determined according to objective criteria and should involve real and effective management action to determine the main decisions governing the fight against illegal online content and activities and compliance with this Regulation in the Union. Authorities in the Member State(s) where consumers are affected shall be responsible for enforcing this Regulation. In the interest of the swift and effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers with no establishment in the Union but which provide services in the Union and therefore fall within the scope of this Regulation, irrespective of the place where they designate or do not designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction and should publish these measures.
2021/07/19
Committee: JURI
Amendment 321 #
Proposal for a regulation
Recital 87
(87) In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, the Commission to intervene and exercise its investigatory and enforcement powers under this Regulation.
2021/07/19
Committee: JURI
Amendment 323 #
Proposal for a regulation
Recital 89
(89) The Board should contribute to achieving a common Union perspective on the consistent application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis-à-vis very large online platforms. The Board should also contribute to the drafting of relevant templates and codes of conduct and analyse emerging general trends in the development of digital services in the Union.
2021/07/19
Committee: JURI
Amendment 324 #
Proposal for a regulation
Recital 90
(90) For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. The decision to deviate therefrom should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation.
2021/07/19
Committee: JURI
Amendment 325 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, respect for intellectual property, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/19
Committee: JURI
Amendment 328 #
Proposal for a regulation
Recital 92
(92) The Commission, through the Chair, should participate in the Board without voting rights. Through the Chair, the Commission should ensure that the agenda of the meetings is set in accordance with the requests of the members of the Board as laid down in the rules of procedure and in compliance with the duties of the Board laid down in this Regulation.
2021/07/19
Committee: JURI
Amendment 329 #
Proposal for a regulation
Recital 94
(94) Given the importance of very large online platforms, in view of their reach and impact, their failure to comply with the specific obligations applicable to them may affect a substantial number of recipients of the services across different Member States and may cause large societal harms, while such failures may also be particularly complex to identify and address.
2021/07/19
Committee: JURI
Amendment 331 #
Proposal for a regulation
Recital 95
(95) In order to address those public policy concerns it is therefore necessary to provide for a common system of enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the Digital Services Coordinator of establishment, upon its own initiative or upon the Board’s advice, should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. That Digital Services Coordinator should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Digital Services Coordinator should perform its role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board.
2021/07/19
Committee: JURI
Amendment 333 #
Proposal for a regulation
Recital 96
(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission may, on its own initiative or upon advice of the Board, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform.
2021/07/19
Committee: JURI
Amendment 335 #
Proposal for a regulation
Recital 97
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
2021/07/19
Committee: JURI
Amendment 339 #
Proposal for a regulation
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties.
2021/07/19
Committee: JURI
Amendment 342 #
Proposal for a regulation
Recital 99
(99) In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties other than individuals provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, data-bases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
2021/07/19
Committee: JURI
Amendment 343 #
Proposal for a regulation
Recital 101
(101) The very large online platforms concerned and other persons subject to the exercise of the Commission’s powers whose interests may be affected by a decision should be given the opportunity of submitting their observations beforehand, and the decisions taken should be widely publicised. While ensuring the rights of defence of the parties concerned, in particular, the right of access to the file, it is essential that confidential information be protected. Furthermore, while respecting the confidentiality of the information, the Commission should ensure that any information relied on for the purpose of its decision is disclosed to an extent that allows the addressee of the decision to understand the facts and considerations that lead up to the decision.
2021/07/19
Committee: JURI
Amendment 345 #
Proposal for a regulation
Recital 104
(104) In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.
2021/07/19
Committee: JURI
Amendment 347 #
Proposal for a regulation
Recital 106 a (new)
(106 a)The 'attention-seeking' profiling business model of digital markets, in which algorithms prioritise controversial content and thereby contribute to its online dissemination, undermines consumer faith in digital markets. This Regulation should therefore put an end to this practice and give users greater control over the way in which rankings are presented;
2021/07/19
Committee: JURI
Amendment 349 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) a framework for the conditional exemption from liability of providers of intermediary services;
2021/07/19
Committee: JURI
Amendment 352 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(ba) protect minors making use of services falling under this Regulation.
2021/07/19
Committee: JURI
Amendment 353 #
Proposal for a regulation
Article 1 – paragraph 2 – point b b (new)
(bb) protect consumers making use of services falling under this Regulation.
2021/07/19
Committee: JURI
Amendment 354 #
Proposal for a regulation
Article 1 – paragraph 3
3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
2021/07/19
Committee: JURI
Amendment 357 #
Proposal for a regulation
Article 1 – paragraph 5 – introductory part
5. This Regulation shall not affect the rules laid down by the following:
2021/07/19
Committee: JURI
Amendment 362 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights in line with Member State legislation;
2021/07/19
Committee: JURI
Amendment 367 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5a. This Regulation does not affect Member States' jurisdiction concerning measures they might take to promote cultural and linguistic diversity and ensure media freedom and pluralism.
2021/07/19
Committee: JURI
Amendment 371 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is established where the activities are conducted in one or more Member States.
2021/07/19
Committee: JURI
Amendment 372 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States; or
deleted
2021/07/19
Committee: JURI
Amendment 374 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
— the targeting of activities towards one or more Member States.
deleted
2021/07/19
Committee: JURI
Amendment 376 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person marketing products and/or services in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;
2021/07/19
Committee: JURI
Amendment 378 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- an online search engine as defined in point (5) of Article 2 of Regulation (EU) 2019/1150.
2021/07/19
Committee: JURI
Amendment 395 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a larger or a potentially unlimited number of third parties;
2021/07/19
Committee: JURI
Amendment 398 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against direct or indirect remuneration specifically for promoting that information;
2021/07/19
Committee: JURI
Amendment 399 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, classify, prioritise or organise in its online interface specific information for recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
2021/07/19
Committee: JURI
Amendment 403 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) 'overriding reasons of general interest' means reasons recognised as such by the case-law of the Court of Justice of the European Union, such as: public policy; public security; public health; the preservation of the financial equilibrium of the social security system; the protection of consumers, recipients of services and workers; the protection of young people; the fairness of trade transactions; the fight against fraud; the protection of the natural and urban environment; animal welfare; intellectual property; the conservation of national historic and artistic heritage; social and cultural policy objectives; housing.
2021/07/19
Committee: JURI
Amendment 407 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) 'online marketplaces' means a service using software, including a website, part of a website or an application, operated by or on behalf of a trader which allows consumers to conclude distance contracts with other traders or consumers, in accordance with Directive 2005/29/EC.
2021/07/19
Committee: JURI
Amendment 420 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously and decisively to remove or to disable access to the illegal content.
2021/07/19
Committee: JURI
Amendment 422 #
Proposal for a regulation
Article 5 – paragraph 2
2. Paragraph 1 shall not apply:
(a) where the recipient of the service is acting under the authority or the control of the provider;
(b) to marketplaces in line with Article 25 which do not comply with Articles 11, 13, 14(1), 19(1), 22, 24 and 29 in terms of mandatory due diligence requirements;
(c) to large platforms as described in Article 25 if they do not comply with the obligations set out in Article 9 of this Regulation.
2021/07/19
Committee: JURI
Amendment 424 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
2021/07/19
Committee: JURI
Amendment 427 #
Proposal for a regulation
Article 5 – paragraph 3 a (new)
3a. Paragraph 1 shall not apply to liability of marketplaces for illegal content which they host on their platforms.
2021/07/19
Committee: JURI
Amendment 428 #
Proposal for a regulation
Article 5 – paragraph 3 b (new)
3b. Paragraph 1 shall not apply where the provider of intermediary services plays an active role in providing, optimising, classifying or organising the referencing or promotion of the content.
2021/07/19
Committee: JURI
Amendment 432 #
Proposal for a regulation
Article 6 – paragraph 1
Where providers of intermediary services are aware of illegal activity or content as a result of investigations undertaken on their own initiative, they shall not be eligible for exemptions from liability. They must therefore act as quickly as possible to withdraw or disable access to such activity or content.
2021/07/19
Committee: JURI
Amendment 527 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in each of the Member States where the provider offers its services.
2021/07/19
Committee: JURI
Amendment 533 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall ensure that their terms and conditions prohibit the recipients of their services from providing content that is not in compliance with Union law or the law of the Member State where such content is made available. The terms and conditions shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/19
Committee: JURI
Amendment 544 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall not take down, disable or interfere in any other way with the editorial content and the services made available by a provider which has editorial liability for that content and is subject to rules which comply with Union and national law. This article shall not affect the possibility of an independent judicial or administrative authority requiring the content provider to terminate or prevent a breach of applicable Union or national law.
2021/07/19
Committee: JURI
Amendment 568 #
Proposal for a regulation
Article 13 – paragraph 1 – point b a (new)
(ba) Providers of intermediary services shall ensure that trade users are registered as such and not as private users and that they are subject to the obligations arising from this status. To this end, providers of intermediary services must take account of the relevant criteria, in particular sales volumes and sales revenue, and any other relevant criteria based on national law.
2021/07/19
Committee: JURI
Amendment 572 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure that the identities of trade users providing goods or services on intermediary services are clearly visible alongside the goods or services provided.
2021/07/19
Committee: JURI
Amendment 574 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
deleted
2021/07/19
Committee: JURI
Amendment 596 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/07/19
Committee: JURI
Amendment 605 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned and give rise to an obligation for the notified provider to quickly remove the notified content or disable access thereto.
2021/07/19
Committee: JURI
Amendment 612 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a swift and objective manner and in any case within a maximum of 72 hours. Where decisions on the removal or deactivation of access to content are taken, providers of hosting services shall take all measures necessary to prevent the same illegal content or equivalent illegal content from reappearing on their service. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. This means, in particular, key information on the procedure followed, the technology used, the criteria and reasoning underpinning the decision and the rationale behind any automated decision-making.
2021/07/19
Committee: JURI
Amendment 618 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where providers of hosting services have previously taken down, removed or deactivated access to illegal content as a result of a notice and a valid claim procedure which did not lead to a successful appeal, they shall take all reasonable, proportionate action to block, deactivate or permanently take down the illegal content or any identical content.
2021/07/19
Committee: JURI
Amendment 621 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6b. The taking down, removal or deactivation of access as defined in Article 14(6a) may be annulled by the following measures: a successful appeal, or a judicial ruling by a court with jurisdiction in a Member State, the General Court or the Court of Justice of the European Union.
2021/07/19
Committee: JURI
Amendment 622 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6c. Providers of hosting services shall, without delay, inform consumers who have purchased illegal products between the time when such products were placed online on the website of the provider and the time when the listing was removed by the platform following a valid notice.
2021/07/19
Committee: JURI
Amendment 623 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
6d. This article shall not apply to editorial content provided by a trader assuming editorial responsibility for that content and complying with rules which are in line with Union and national law.
2021/07/19
Committee: JURI
Amendment 625 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, or to limit the visibility of, suspend or put a stop to monetary payments linked to such content, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient of the decision or of the restriction of visibility or the suspension or ceasing of payments, at the latest at the time of the removal or disabling of access, and shall provide a clear and specific statement of reasons for that decision.
2021/07/19
Committee: JURI
Amendment 631 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, the information, or the limiting of its visibility or the ceasing of its monetisation, and, where relevant, the territorial scope of the disabling of access;
2021/07/19
Committee: JURI
Amendment 635 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission which is accessible to national and European authorities. That information shall not include personal data.
2021/07/19
Committee: JURI
Amendment 648 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service and third parties which have made a referral, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system, which enables complaints to be lodged, electronically and free of charge, against decisions taken by the online platform not to take action following receipt of a notice and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/07/19
Committee: JURI
Amendment 652 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove the information, limit its visibility, suspend the possibility of purchase or rental, or disable access to it;
2021/07/19
Committee: JURI
Amendment 673 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants and the individual or bodies which submitted a referral linked to the complainant’s request of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities.
2021/07/19
Committee: JURI
Amendment 700 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof. Out-of-court dispute settlement procedures should preferably be free of charge for consumers. If a charge is made, the procedure should be accessible and inexpensive for consumers. To this end, such charges should not exceed a symbolic amount.
2021/07/19
Committee: JURI
Amendment 709 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms and providers of hosting services shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/07/19
Committee: JURI
Amendment 738 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform or a provider of hosting services has information indicating that a trusted flagger submitted a significant number of wrongful notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/07/19
Committee: JURI
Amendment 740 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, including the information provided by an online platform or a provider of hosting services pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/07/19
Committee: JURI
Amendment 747 #
Proposal for a regulation
Article 20 – paragraph 1
1. Providers of hosting services and online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that repeatedly provide manifestly illegal content or enable such content to be made available.
2021/07/19
Committee: JURI
Amendment 750 #
Proposal for a regulation
Article 20 – paragraph 1 a (new)
1a. Without prejudice to Article 4 of the P2B Regulation, providers of hosting services shall do all in their power to ensure that users which have been suspended from the service cannot use it again until such time as the suspension is lifted. Online platforms shall stop providing their services to trade users which repeatedly provide illegal content and which have previously been suspended. Where an online platform stops providing its services to a trade user, it shall provide that user, at least 15 days before the termination comes into force, with the reasons for its decision and shall inform it of the possibility to challenge the decision under Article 17.
2021/07/19
Committee: JURI
Amendment 752 #
Proposal for a regulation
Article 20 – paragraph 2
2. Providers of hosting services and online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/07/19
Committee: JURI
Amendment 755 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Providers of hosting services and online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
2021/07/19
Committee: JURI
Amendment 759 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
2021/07/19
Committee: JURI
Amendment 762 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. In cases of repeated suspension, providers of hosting services shall terminate the provision of their services to the recipients of those services.
2021/07/19
Committee: JURI
Amendment 764 #
Proposal for a regulation
Article 20 – paragraph 3 b (new)
3b. Providers of hosting services shall establish mechanisms to prevent the re- registration of recipients of services which repeatedly provide or enable the provision of illegal content.
2021/07/19
Committee: JURI
Amendment 765 #
Proposal for a regulation
Article 20 – paragraph 4
4. Providers of hosting services and online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
2021/07/19
Committee: JURI
Amendment 768 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform or an online service provider becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2021/07/19
Committee: JURI
Amendment 771 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the online platform or an online service provider cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol.
2021/07/19
Committee: JURI
Amendment 778 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform or an online service provider allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
2021/07/19
Committee: JURI
Amendment 788 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform or the online service provider shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources.
2021/07/19
Committee: JURI
Amendment 790 #
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
3. Where the online platform or the online service provider obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/07/19
Committee: JURI
Amendment 794 #
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a. The online platform or the online service provider shall apply identification and verification measures not only to new corporate clients but shall also check and update the information it holds on existing corporate clients at least once a year.
2021/07/19
Committee: JURI
Amendment 798 #
Proposal for a regulation
Article 22 – paragraph 4
4. The online platform or the online service provider shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
2021/07/19
Committee: JURI
Amendment 801 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the platform or the online service provider shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
2021/07/19
Committee: JURI
Amendment 804 #
Proposal for a regulation
Article 22 – paragraph 6
6. The online platform or the online service provider shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
2021/07/19
Committee: JURI
Amendment 808 #
Proposal for a regulation
Article 22 – paragraph 7
7. The online platform or the online service provider shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
2021/07/19
Committee: JURI
Amendment 858 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 868 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, human dignity, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 876 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/07/19
Committee: JURI
Amendment 883 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/19
Committee: JURI
Amendment 884 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, the features or functioning of their services, or their terms and conditions; the rules and penalties for all decisions on content moderation must be clear, specific and applicable for users;
2021/07/19
Committee: JURI
Amendment 894 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Articles 35 and 37 respectively.
deleted
2021/07/19
Committee: JURI
Amendment 906 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the systemic risks identified.
2021/07/19
Committee: JURI
Amendment 912 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
2021/07/19
Committee: JURI
Amendment 925 #
Proposal for a regulation
Article 28 – paragraph 4
4. Large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
2021/07/19
Committee: JURI
Amendment 927 #
Proposal for a regulation
Article 29 – paragraph 1
1. Online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available. Online platforms shall ensure that consumer profiling does not take place by default, unless this is the preferred option of the latter, in line with the requirements set out in Regulation (EU) 2016/679. Online platforms shall not distort or impede the autonomy, decision-making or choice of consumers through any structure, function or modus operandi of their online interface or any part of that interface.
2021/07/19
Committee: JURI
Amendment 941 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
2021/07/19
Committee: JURI
Amendment 949 #
Proposal for a regulation
Article 30 – paragraph 1
1. Large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/19
Committee: JURI
Amendment 965 #
Proposal for a regulation
Article 31 – paragraph 1
1. Large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request, within a reasonable period specified in the request and within a maximum of 72 hours, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
2021/07/19
Committee: JURI
Amendment 970 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
2021/07/19
Committee: JURI
Amendment 974 #
Proposal for a regulation
Article 31 – paragraph 3
3. Large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate.
2021/07/19
Committee: JURI
Amendment 978 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/19
Committee: JURI
Amendment 980 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 3 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested.
2021/07/19
Committee: JURI
Amendment 982 #
Proposal for a regulation
Article 31 – paragraph 6 – point a
(a) it does not have access to the data;
deleted
2021/07/19
Committee: JURI
Amendment 984 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.
deleted
2021/07/19
Committee: JURI
Amendment 986 #
Proposal for a regulation
Article 31 – paragraph 7
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
deleted
2021/07/19
Committee: JURI
Amendment 990 #
Proposal for a regulation
Article 32 – paragraph 1
1. Large online platforms shall appoint one or more compliance officers responsible for monitoring their compliance with this Regulation.
2021/07/19
Committee: JURI
Amendment 991 #
Proposal for a regulation
Article 32 – paragraph 2
2. Large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned.
2021/07/19
Committee: JURI
Amendment 993 #
Proposal for a regulation
Article 32 – paragraph 4
4. Large online platforms shall take the necessary measures to ensure that the compliance officers can perform their tasks in an independent manner.
2021/07/19
Committee: JURI
Amendment 995 #
Proposal for a regulation
Article 32 – paragraph 5
5. Large online platforms shall communicate the name and contact details of the compliance officer to the Digital Services Coordinator of establishment and the Commission.
2021/07/19
Committee: JURI
Amendment 996 #
Proposal for a regulation
Article 32 – paragraph 6
6. Large online platforms shall support the compliance officer in the performance of his or her tasks and provide him or her with the resources necessary to adequately carry out those tasks. The compliance officer shall directly report to the highest management level of the platform.
2021/07/19
Committee: JURI
Amendment 997 #
Proposal for a regulation
Article 33 – title
Transparency reporting obligations for very large online platforms
2021/07/19
Committee: JURI
Amendment 998 #
Proposal for a regulation
Article 33 – paragraph 1
1. Large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months.
2021/07/19
Committee: JURI
Amendment 999 #
Proposal for a regulation
Article 33 – paragraph 2 – introductory part
2. In addition to the reports provided for in Article 13, very large online platforms shall make publicly available and transmit to the Digital Services Coordinator of establishment and the Commission, at least once a year and within 30 days following the adoption of the audit implementation report provided for in Article 28(4):
2021/07/19
Committee: JURI
Amendment 1004 #
Proposal for a regulation
Article 33 – paragraph 3
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 might result in the disclosure of confidential information of that platform or of the recipients of the service, might cause significant vulnerabilities for the security of its service, might undermine public security or might harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports.
deleted
2021/07/19
Committee: JURI
Amendment 1012 #
Proposal for a regulation
Article 35
Article 35
Codes of conduct
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
deleted
2021/07/19
Committee: JURI
Amendment 1030 #
Proposal for a regulation
Article 36
Codes of conduct for online advertising
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least:
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24;
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
Article 36 deleted
2021/07/19
Committee: JURI
Amendment 1043 #
Proposal for a regulation
Article 37
[...]
Article 37 deleted
2021/07/19
Committee: JURI
Amendment 1062 #
Proposal for a regulation
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1 % of the annual income or turnover of the provider concerned. If the offence is repeated, suspension and then prohibition of access to the European market must be considered.
2021/07/19
Committee: JURI
Amendment 1063 #
Proposal for a regulation
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall correspond to 10% of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 5% of the annual income or turnover of the provider concerned.
2021/07/19
Committee: JURI
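Purely as an illustration of the arithmetic at stake in the two competing versions of Article 42(3) above, the following is a minimal sketch; the percentages are those quoted in the Commission text (retained by Amendment 1062) and in Amendment 1063, while the turnover figure and all names in the code are hypothetical, not data from the dossier.

    # Minimal sketch (not part of the dossier): compares the penalty ceilings
    # under the Commission text of Article 42(3), retained by Amendment 1062
    # (6 % main cap, 1 % for information offences), with those proposed in
    # Amendment 1063 (10 % main cap, 5 % for information offences).
    # The turnover figure below is hypothetical.

    def penalty_ceiling(annual_turnover_eur: float, rate: float) -> float:
        """Maximum penalty expressed as a share of annual income or turnover."""
        return annual_turnover_eur * rate

    turnover = 10_000_000_000  # hypothetical provider: EUR 10 bn annual turnover

    variants = {
        "Commission text / Amendment 1062 (6 % main, 1 % information)": (0.06, 0.01),
        "Amendment 1063 (10 % main, 5 % information)": (0.10, 0.05),
    }

    for label, (main_rate, info_rate) in variants.items():
        print(f"{label}:")
        print(f"  main penalty ceiling:        EUR {penalty_ceiling(turnover, main_rate):,.0f}")
        print(f"  information-offence ceiling: EUR {penalty_ceiling(turnover, info_rate):,.0f}")

Under these assumptions the main ceiling rises from EUR 600 million to EUR 1 billion between the two variants, which is the practical difference the competing amendments put before the committee.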
Amendment 1065 #
Proposal for a regulation
Article 43 – title
Right to lodge a complaint and right to judicial remedy
2021/07/19
Committee: JURI
Amendment 1067 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established, or with any legal or natural person with an interest in acting as a trusted flagger. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority and shall inform the person who submitted the complaint.
2021/07/19
Committee: JURI
Amendment 1075 #
Proposal for a regulation
Article 44 – paragraph 1
1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the European Parliament, to the Commission and to the Board.
2021/07/19
Committee: JURI
Amendment 1107 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) advise the Commission to take the measures referred to in Article 51 and, where requested by the Commission, adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation;
2021/07/19
Committee: JURI
Amendment 1109 #
Proposal for a regulation
Chapter IV – Section 3 – title
Supervision, investigation, enforcement and monitoring in respect of very large online platforms
2021/07/19
Committee: JURI
Amendment 1110 #
Proposal for a regulation
Article 50 – title
Enhanced supervision for very large online platforms
2021/07/19
Committee: JURI
Amendment 1115 #
Proposal for a regulation
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
2021/07/19
Committee: JURI