384 Amendments of Alexandra GEESE related to 2020/0361(COD)
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisements that are themselves illegal content, to the creation of financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and their participation at multiple levels of the advertising value chain have created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients.
Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
Amendment 198 #
Proposal for a regulation
Recital 5 a (new)
(5a) Given the cross-border nature of the services concerned, Union action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right of all consumers and other recipients of services, including persons with disabilities, to access and choose those services is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their online interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
Amendment 207 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the directing of activities towards one or more Member States. The directing of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. __________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 215 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. Therefore, Chapter III (Articles 10 to 37) also applies as a horizontal framework mutatis mutandis to intermediary services when implementing other secondary legislation, to the extent no more specific rules are laid down. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1 . 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 222 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.
Amendment 231 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined appropriately and should also cover unlawful information directly relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that directly relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegally traded animals, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 247 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, may, in general, not be considered as a dissemination to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. __________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 254 #
Proposal for a regulation
Recital 15 a (new)
(15a) Ensuring that providers of intermediary services can offer strong and effective end-to-end encryption is essential for trust in and security of digital services in the Digital Single Market, and effectively prevents unauthorised third- party access.
Amendment 275 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to content where it is evident to a layperson, without any substantive analysis, that the content is manifestly illegal or where it has become aware of the unlawful nature of that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 289 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 294 #
Proposal for a regulation
Recital 25
Amendment 305 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed and open online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
Amendment 308 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. Domain name system (DNS) registration services can also benefit from the exemptions from liability set out in this Regulation.
Amendment 317 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and therefore, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 323 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. The applicable rules on the mutual recognition of court decisions should be unaffected.
Amendment 366 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes.
Amendment 370 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40, or that are not-for-profit services with fewer than 100 000 monthly active users. __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 373 #
Proposal for a regulation
Recital 39 a (new)
(39a) Recipients of the service should be empowered to make autonomous decisions inter alia regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. However, dark patterns typically exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from subverting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark patterns’). This includes, but is not limited to, exploitative design choices to direct the recipient to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, repetitively requesting or pressuring the recipient to make a decision or hiding or obscuring certain options.
Amendment 379 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Content that has been notified and that is not manifestly illegal should remain accessible while the assessment of its legality by the competent authority is still pending. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Recipients of the service who provided the information to which the notice relates should be given the opportunity to reply before a decision is taken. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as providers of hosting services covered by this Regulation.
Amendment 412 #
Proposal for a regulation
Article 2 b (new)
Amendment 413 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent, accurate and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations, consumer protection organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist, xenophobic or discriminatory expressions online or to combatting digital violence or supporting victims of digital violence. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and respect exceptions and limitations to intellectual property rights.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 427 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion, while ensuring a high level of security of the information concerned in order to protect such information against accidental or unlawful destruction, accidental loss or alteration, or unauthorised or unlawful storage, processing, access or disclosure. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Amendment 439 #
Proposal for a regulation
Recital 49 a (new)
(49a) In order to contribute to a transparent online environment for consumers that supports the green transition, online platforms that allow consumers to conclude distance contracts with traders should provide consumers in real time with clear and unambiguous information on the environmental impact of their products and services, such as the use of sustainable and efficient delivery methods, sustainable and ecological packaging, as well as the environmental costs of returning goods in the event of withdrawal.
Amendment 450 #
Proposal for a regulation
Recital 50 a (new)
(50a) In the light of effective enforcement of local rules to combat long-term rental housing shortages and to limit short-term holiday rentals, as was justified in the Cali Apartments case (cases C-724/18 and C-727/18), all natural or legal persons renting out short- term holiday rentals shall be subject to the obligations under Article 22 of this Regulation.
Amendment 453 #
Proposal for a regulation
Recital 52
Amendment 460 #
Proposal for a regulation
Recital 52 a (new)
(52a) A core part of an online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, online platforms should ensure that recipients can understand how recommender systems impact the way information is displayed, and can influence how information is presented to them. They should clearly present the parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters. Options not based on profiling of the recipient should be available and used by default.
Amendment 475 #
Proposal for a regulation
Recital 57
(57) Five categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the intended use and misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products and illegally traded animals. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to technology design choices such as the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition.
A third category of risks concerns the intended use of, malfunctioning of, as well as the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors or other vulnerable groups, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices, including undisclosed commercial communications published by recipients of the service that are not marketed, sold or arranged by the online platform. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions. A fourth category concerns negative societal effects of technology design, value chain and business-model choices in relation to systemic risks that represent threats to democracy. A fifth category concerns environmental risks such as high electricity and water consumption, heat production and CO2 emissions related to the provision of the service and technical infrastructure or to user behaviour modification with a direct environmental impact, such as directing users to choose less sustainable options when it comes to delivery or packaging.
Amendment 489 #
Proposal for a regulation
Recital 60
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give vetted auditors access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. This guarantee should not be a means to circumvent the applicability of audit obligations in this Regulation applicable to very large online platforms. Vetted auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
Amendment 491 #
Proposal for a regulation
Recital 61
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Agency without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the vetted auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the vetted auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
Amendment 493 #
Proposal for a regulation
Recital 61 a (new)
Recital 61 a (new)
(61a) In order to ensure a participative and inclusive approach and address societal concerns raised by the services of very large online platforms, it is necessary to set up a European Social Media Council at Union level. The transparency, inclusiveness and independence of the Council ensure that decisions on content moderation are shaped by a diverse range of expertise and perspectives. The Council should support the Agency and the Commission by issuing policy and implementation recommendations and help platforms improve and adjust their content moderation practices under their terms and conditions. The Council should consist of independent experts, representatives of the recipients of the service, representatives of groups potentially impacted by their services, and civil society organisations. While not legally binding, the Council’s recommendations will yield effective outcomes, incorporating a wider and more diverse range of inputs to the societal challenges that very large online platforms may pose. Its strength and efficiency are based on voluntary compliance by platforms, whose commitment will be to respect and execute the Council’s recommendations in good faith. In order to function efficiently, the Council and its members should have sufficient human, material and financial resources at their disposal.
Amendment 494 #
Proposal for a regulation
Recital 62
Recital 62
Amendment 498 #
Proposal for a regulation
Recital 62 a (new)
Recital 62 a (new)
(62a) Recommender systems used by very large online platforms pose a particular risk in terms of consumer choice and lock-in effects. Consequently, in addition to the obligations applicable to all online platforms, very large online platforms should offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems, including through application programming interfaces.
Amendment 502 #
Proposal for a regulation
Recital 64
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Agency may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers, civil society and media organisations on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Agency and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, not-for-profit bodies, organisations or associations, or media organisations. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. To that end, the Commission should issue regulatory guidance to specify the modalities and safeguards for data access and sharing, and provide platforms with legal certainty while ensuring the independence of the research.
Amendment 517 #
Proposal for a regulation
Recital 67
Recital 67
(67) The Commission and the Agency should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Agency, by participating in the same codes of conduct.
Amendment 532 #
Proposal for a regulation
Recital 71 a (new)
Recital 71 a (new)
(71a) In order to ensure that the systemic role of very large online platforms does not endanger the internal market by unfairly excluding innovative new entrants, including SMEs, entrepreneurs and start-ups, additional rules are needed to allow recipients of the service to switch or connect and interoperate between online platforms or internet ecosystems. Therefore, interoperability obligations should require very large online platforms to share appropriate tools, data, expertise, and resources. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of technical interfaces (Application Programming Interface), that allow recipients of service or other market participants to access the key functionalities of very large online platforms to exchange information.
Amendment 533 #
Proposal for a regulation
Recital 72
Recital 72
(72) The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States, with the exception of the oversight and enforcement of Chapter III, Section 4, which shall lie with the Agency. To this end, Member States should appoint at least one independent authority with the task of applying and enforcing this Regulation. Member States should however be able to entrust more than one competent authority with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure.
Amendment 535 #
Proposal for a regulation
Recital 73
Recital 73
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Agency, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level.
Amendment 538 #
Proposal for a regulation
Recital 74
Recital 74
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Commission and the Agency. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate.
Amendment 543 #
Proposal for a regulation
Recital 79
Recital 79
(79) In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence, and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Agency pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should in principle take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States.
Amendment 548 #
Proposal for a regulation
Recital 85
Recital 85
(85) Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than three Member States, should be able to refer the matter to the Agency in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. The Agency, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Agency’s intervention in accordance with Article 45(5).
Amendment 553 #
Proposal for a regulation
Recital 86
Recital 86
(86) In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Agency may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved.
Amendment 555 #
Proposal for a regulation
Recital 87
Recital 87
(87) In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, the Agency to intervene and exercise its investigatory and enforcement powers under this Regulation.
Amendment 556 #
Proposal for a regulation
Recital 88
Recital 88
Amendment 559 #
Proposal for a regulation
Recital 89
Recital 89
Amendment 562 #
Proposal for a regulation
Recital 90
Recital 90
Amendment 565 #
Proposal for a regulation
Recital 91
Recital 91
Amendment 569 #
Proposal for a regulation
Recital 92
Recital 92
Amendment 571 #
Proposal for a regulation
Recital 93
Recital 93
Amendment 572 #
Proposal for a regulation
Recital 95
Recital 95
(95) In order to address those public policy concerns it is therefore necessary to provide for a common approach to enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the Agency should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. The Agency should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Agency should perform its role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board.
Amendment 573 #
Proposal for a regulation
Recital 96
Recital 96
(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, the Agency may decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Agency should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Agency’s request, or in situations where the Digital Services Coordinator of establishment itself requested the Agency to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform.
Amendment 575 #
Proposal for a regulation
Recital 97
Recital 97
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 582 #
Proposal for a regulation
Recital 98
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Agency should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties.
Amendment 587 #
Proposal for a regulation
Recital 99
Recital 99
(99) In particular, the Agency should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Agency should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information. In addition, the Agency should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Agency should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Agency should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Agency’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
Amendment 589 #
Proposal for a regulation
Recital 101
Recital 101
(101) The very large online platforms concerned and other persons subject to the exercise of the Agency’s powers whose interests may be affected by a decision should be given the opportunity of submitting their observations beforehand, and the decisions taken should be widely publicised. While ensuring the rights of defence of the parties concerned, in particular, the right of access to the file, it is essential that confidential information be protected. Furthermore, while respecting the confidentiality of the information, the Agency should ensure that any information relied on for the purpose of its decision is disclosed to an extent that allows the addressee of the decision to understand the facts and considerations that led up to the decision.
Amendment 591 #
Proposal for a regulation
Recital 102
Recital 102
Amendment 593 #
Proposal for a regulation
Recital 103
Recital 103
Amendment 603 #
Proposal for a regulation
Article 1 – paragraph 1 – point b a (new)
Article 1 – paragraph 1 – point b a (new)
(ba) rules on transparency, accountability and respect for fundamental rights as regards the design and implementation of voluntary, self- and co-regulatory measures;
Amendment 609 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 619 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
Article 1 – paragraph 2 – point b a (new)
(ba) achieve a high level of consumer protection in the Digital Single Market.
Amendment 627 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
Article 1 – paragraph 5 – point b
(b) Directive (EU) 2018/1808;
Amendment 633 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
Article 1 – paragraph 5 – point c
(c) Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
Amendment 637 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882.
Amendment 648 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
Article 2 – paragraph 1 – point b
(b) ‘recipient of the service’ means any natural or legal person who, for professional ends or otherwise, uses the relevant intermediary service for seeking information or making it accessible;
Amendment 668 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is offering goods or services, including through any person acting in his or her name or on his or her behalf, for purposes directly relating to his or her trade, business, craft or profession;
Amendment 684 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
Article 2 – paragraph 1 – point g
(g) ‘allegedly illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is subject to allegations of non-compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 691 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
Article 2 – paragraph 1 – point g a (new)
(ga) ‘manifestly illegal content’ means any information which has been the subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
Amendment 712 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means making information accessible, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
Amendment 715 #
Proposal for a regulation
Article 2 – paragraph 1 – point k
Article 2 – paragraph 1 – point k
(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications, which enables recipients of the service to access and interact with the relevant intermediary service;
Amendment 721 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, whether for commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information;
Amendment 722 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, prioritise or rank in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 726 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
Article 2 – paragraph 1 – point p
Amendment 737 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
Article 2 – paragraph 1 – point q a (new)
(qa) ‘dark pattern’ means an online interface or a part thereof that via its structure, design or functionality subverts or impairs the autonomy, decision-making, preferences or choice of recipients of the service.
Amendment 746 #
Proposal for a regulation
Article 2 a (new)
Article 2 a (new)
Amendment 769 #
Proposal for a regulation
Article 5 – paragraph 3
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
Amendment 783 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Amendment 797 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Article 7 – paragraph 1 a (new)
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity.
Amendment 802 #
Proposal for a regulation
Article 8 – paragraph 1
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against one or more specific items of illegal content, issued by the relevant national judicial authority, or against an offer of illegal goods or services issued by the relevant national administrative or judicial authorities, through trusted and secure communication channels, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 808 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
Article 8 – paragraph 1 a (new)
1a. Individuals shall have the right to report allegedly illegal content or to mandate a body, organisation or association referred to in Article 68 to report such content to the competent authorities in their country of residence, which shall expeditiously make a ruling. Where the content is deemed illegal under the national law of the country of residence of the individual, or manifestly illegal under Union law, this shall be reported to the platform for immediate enforcement on the territory of the Member State issuing the order and to the competent authorities for assessment under national law.
Amendment 816 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
Article 8 – paragraph 2 – point a – indent 1
— a sufficiently detailed statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
Amendment 820 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
Article 8 – paragraph 2 – point a – indent 2
— a clear indication of the exact electronic location of that information, such as the exact URL or URLs where appropriate and, where necessary, additional information enabling the identification of the illegal content concerned;
Amendment 825 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
Article 8 – paragraph 2 – point a – indent 3
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including deadlines for appeal;
Amendment 827 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective and does not lead to the removal of content that is legal in another Member State;
Amendment 831 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
Article 8 – paragraph 2 – point b a (new)
(ba) the territorial scope of an order addressed to a provider that has its main establishment in another Member State is limited to the territory of the Member State issuing the order;
Amendment 832 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
Article 8 – paragraph 2 – point b b (new)
(bb) the territorial scope of an order addressed to a provider or its representative that has its main establishment outside the Union, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
Amendment 853 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a. Providers of intermediary services may refuse to execute an order referred to in paragraph 1 if it contains manifest errors or does not contain sufficient information as referred to in paragraph 2. Providers shall inform the competent authority without undue delay, asking for the necessary clarification. They may submit an appeal to the Digital Services Coordinator of establishment where they consider that the territorial scope of the order is disproportionate.
Amendment 855 #
Proposal for a regulation
Article 8 – paragraph 4 b (new)
4b. Member States shall ensure that the judicial authorities may, at the request of an applicant whose personality rights are infringed by illegal content, issue against the relevant provider of hosting services an order in accordance with this Article to remove or disable access to this content, including by way of an interlocutory injunction.
Amendment 860 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial authorities, or, regarding offers of illegal goods or services, issued by administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order via trusted and secure communications channels.
Amendment 869 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
— a clear indication of the exact electronic location, an account name or a unique identifier of the recipient on whom information is sought;
Amendment 874 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about legal redress available to the provider and to the recipients of the service concerned including deadlines for appeal, and ensure that they can be exercised effectively;
Amendment 875 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
— whether the provider may swiftly inform the recipient of the service concerned.
Amendment 883 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2a. The provider of the service shall inform the recipient of the service whose data is being sought without undue delay.
Amendment 889 #
Proposal for a regulation
Article 9 a (new)
Article 9a
Effective remedies for consumers
1. Recipients of the service whose content was removed according to Article 8 or whose information was sought according to Article 9 shall have the right to effective remedies against such orders, without prejudice to remedies available under Directive (EU) 2016/680 and Regulation (EU) 2016/679.
2. Such right to an effective remedy shall be exercised before a court in the issuing Member State in accordance with national law and shall include the possibility to challenge the legality of the measure, including its necessity and proportionality.
3. Digital Services Coordinators shall publish a ‘toolbox’ of complaint and redress mechanisms applicable in their respective territory, in at least one of the official languages of the Member State where they operate.
Amendment 890 #
Proposal for a regulation
Article 9 b (new)
Article 9b Where the issuing authority is subject to a procedure under Article 7(1) or 7(2) of the Treaty on European Union, the provider of intermediary services shall act upon the order or transmit the requested data only after receiving the explicit written approval of the Digital Services Coordinator of establishment.
Amendment 895 #
Proposal for a regulation
Article 9 a (new)
Article 9a Exclusion for micro enterprises and not- for-profit services This Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100,000 monthly active users.
Amendment 900 #
Proposal for a regulation
Article 10 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure that recipients of the service, including affected non-users, can communicate with them in a direct, accessible and timely manner and, as necessary, request non-automated responses.
Amendment 929 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of information provided by the recipients of the service in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, user-friendly and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the languages in which the service is offered.
Amendment 936 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user- friendly and unambiguous language, and in an easily accessible and machine- readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available.
Amendment 938 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a coherent, predictable, non-discriminatory, transparent, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions and modifications referred to in paragraph 1, in compliance with procedural safeguards and in full respect of the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter and relevant national law.
Amendment 955 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Any restriction referred to in paragraph 1 must respect the fundamental rights enshrined in the Charter and relevant national law.
Amendment 957 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Individuals who are enforcing restrictions on the basis of terms and conditions of providers of intermediary services shall be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant.
Amendment 960 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall notify the recipients of the service of any change to the contract terms and conditions that can affect their rights and provide a user-friendly explanation thereof. The changes shall not be implemented before the expiry of a notice period which is reasonable and proportionate to the nature and extent of the envisaged changes and to their consequences for the recipients of the service. That notice period shall be at least 15 days from the date on which the provider of intermediary services notifies the recipients about the changes. Failure to consent to such changes should not lead to basic services becoming unavailable.
Amendment 978 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish in a standardised and machine-readable format, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:
Amendment 987 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, including removals, suspensions, demotions or the imposition of other sanctions, categorised by the type of reason and basis for taking those measures, as well as measures taken to provide training and assistance to members of staff who are engaged in content moderation;
Amendment 990 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Amendment 1001 #
Proposal for a regulation
Article 13 – paragraph 2
Amendment 1014 #
Proposal for a regulation
Article 13 a (new)
Amendment 1024 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user- friendly, and allow for the submission of notices exclusively by electronic means. These mechanisms shall be close to the content in question and located on the same level in the online interface as, and clearly distinguishable from, where applicable, mechanisms for notification of alleged violations of terms and conditions. The Commission shall adopt delegated acts in accordance with Article 69 to lay down specific requirements regarding the mechanisms referred to in paragraph 1.
Amendment 1034 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator may, in some cases, identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements:
Amendment 1041 #
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
(aa) evidence that substantiates the claim, where possible;
Amendment 1045 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the exact electronic location of that information, such as the URL or URLs or other identifiers where appropriate, and, where necessary, additional information enabling the identification of the alleged illegal content;
Amendment 1051 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Amendment 1054 #
Proposal for a regulation
Article 14 – paragraph 3
Amendment 1065 #
Proposal for a regulation
Article 14 – paragraph 4
4. The individual or entity that submitted the notice shall be given the option to provide an electronic mail address to enable the provider of hosting services to promptly send a confirmation of receipt of the notice to that individual or entity.
Amendment 1066 #
Proposal for a regulation
Article 14 – paragraph 4 a (new)
4a. Where individuals decide to include their contact details in a notice, their anonymity towards the recipient of the service who provided the content shall be ensured, except in cases of alleged violations of personality rights or of intellectual property rights.
Amendment 1068 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that action.
Amendment 1069 #
Proposal for a regulation
Article 14 – paragraph 5 a (new)
5a. The provider of intermediary services shall also notify the recipient of the service who provided the information, where contact details are available, giving them the opportunity to reply before taking a decision, unless this would obstruct the prevention and prosecution of serious criminal offences.
Amendment 1077 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and non-arbitrary manner. Where they use automated means for pre-processing notices or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 1085 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
Amendment 1086 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6b. Information that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality is still pending. Member States shall ensure that providers of intermediary services are not held liable for failure to remove notified information, while the assessment of legality is still pending.
Amendment 1090 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.
Amendment 1091 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
6d. The provider of hosting services shall ensure that the processing of notices is undertaken by qualified individuals to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
Amendment 1092 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or to disable access to, or to demote or otherwise impose sanctions against, specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall promptly inform the recipient, at the latest at the time of the removal or disabling of access, of the action, provide a clear and specific statement of reasons for that action, and include information on the possibility to issue a counter-notice, to make use of the internal complaint-handling system set out in Article 17 and to appeal a decision with the competent authority. This obligation shall not apply, and statements of reasons may be withheld, where:
(a) it is necessary for the investigation or prosecution of violations of law or public policy, including for ongoing criminal investigations, to justify avoiding or postponing notice to the recipient; or
(b) the content removed was a component of high-volume, commercial campaigns to deceive users or manipulate content moderation efforts.
Amendment 1101 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the action entails either the removal of, demotion or other sanction against, or the disabling of access to, the information and, where relevant, the territorial scope of the action, including, where a decision was taken pursuant to Article 14, an explanation about why the disabling of access did not exceed what was strictly necessary to achieve its objective;
Amendment 1107 #
Proposal for a regulation
Article 15 – paragraph 2 – point b
(b) the facts and circumstances relied on in taking the action, including, where relevant, whether the action was taken pursuant to a notice of manifestly illegal content submitted in accordance with Article 14 or to an order in accordance with Article 8;
Amendment 1109 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means informing the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 1112 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
(d) where the decision concerns manifestly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;
Amendment 1116 #
Proposal for a regulation
Article 15 – paragraph 2 – point f
(f) clear, user-friendly information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint- handling mechanisms, out-of- court dispute settlement and judicial redress.
Amendment 1118 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall publish the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible, machine-readable database managed and published by the Commission. That information shall not contain personal data.
Amendment 1135 #
Proposal for a regulation
Article 16
Amendment 1153 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, demote, disable access to or impose other sanctions against the information;
Amendment 1175 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly and non-discriminatory, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities.
Amendment 1180 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, non-discriminatory and non-arbitrary manner and within seven days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1191 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not taken solely on the basis of automated means and are reviewed by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
Amendment 1199 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service and organisations mandated under Article 68 shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in Article 17. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. Online platforms shall not be liable for implementing decisions of a dispute settlement procedure. The first subparagraph is without prejudice to the right of the recipient concerned to seek redress against the decision before a court in accordance with the applicable law.
Amendment 1210 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – introductory part
2. The Digital Services Coordinator of the Member State where the independent out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum of three years, which can be renewed, where the body has demonstrated that it meets all of the following conditions:
Amendment 1214 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and its members are remunerated in a way that is not linked to the outcome of the procedure;
Amendment 1216 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a a (new)
(aa) it is composed of legal experts;
Amendment 1218 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b a (new)
(ba) the natural persons with responsibility for dispute settlement are granted a period of office of a minimum of three years to ensure the independence of their actions;
Amendment 1219 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b b (new)
(bb) the natural persons with responsibility for dispute settlement commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended;
Amendment 1220 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point b c (new)
(bc) natural persons with responsibility for dispute resolution may not have worked for an online platform or a professional organisation or business association of which the online platform is a member for a period of two years before taking up their position in the body;
Amendment 1227 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
Amendment 1228 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c a (new)
(ca) the anonymity of the individuals involved in the settlement procedure can be guaranteed;
Amendment 1231 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it ensures the settling of disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union or, at the request of the recipient, at least in English;
Amendment 1237 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible.
Amendment 1239 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e a (new)
(ea) it ensures that a preliminary decision is taken within a period of seven days following receipt of the complaint and that the outcome of the dispute settlement is made available within a period of 90 calendar days from the date on which the body has received the complete complaint file.
Amendment 1247 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
If the body decides the dispute in favour of the recipient of the service or organisations mandated under Article 68, the online platform shall reimburse the recipient or organisation for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient or organisation shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
Amendment 1252 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. By 31 December 2024, and every two years thereafter, Digital Services Coordinators shall assess whether the dispute settlement bodies that they have certified in accordance with paragraph 2 comply with the requirements of this Regulation. Each Digital Services Coordinator shall publish and send to the Agency a report on the development and functioning of those bodies. That report shall in particular:
(a) identify best practices of the out-of-court dispute settlement bodies;
(b) report on any demonstrable shortcomings, supported by statistics, that hinder the functioning of the out-of-court dispute settlement bodies for both domestic and cross-border disputes, where appropriate;
(c) make recommendations on how to improve the effective and efficient functioning of the out-of-court dispute settlement bodies, where appropriate.
Amendment 1260 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and expeditiously, taking into account due process. The use of automated notices by trusted flaggers without effective human review shall not be accepted as a valid means of submission.
Amendment 1267 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2. The status of trusted flagger under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions:
Amendment 1271 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying allegedly illegal content;
Amendment 1275 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement or governmental entity;
Amendment 1280 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent, accurate and objective manner.
Amendment 1282 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it publishes, at least once a year, clear, easily comprehensible and detailed reports on all notices submitted in accordance with Article 14 during the relevant period. The report shall list:
- notices categorised by the identity of the provider of hosting services;
- the type of content notified;
- the specific legal provisions allegedly breached by the content notified;
- the action taken by the provider;
- any potential conflicts of interest and sources of funding, and an explanation of the procedures in place to ensure that the trusted flagger maintains its independence.
Amendment 1295 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall award the trusted flagger status for periods of three years, upon which the status may be renewed where the trusted flagger concerned continues to meet the requirements of this Regulation, and shall communicate to the Agency the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2.
Amendment 1299 #
Proposal for a regulation
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraphs 3 and 6 in a publicly available database in an easily accessible and machine-readable format and keep the database updated.
Amendment 1304 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a not insignificant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 1310 #
Proposal for a regulation
Article 19 – paragraph 6
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 1315 #
Proposal for a regulation
Article 19 – paragraph 7
Article 19 – paragraph 7
7. The Agency may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6.
Amendment 1318 #
Proposal for a regulation
Article 20 – paragraph 1
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. Any prior warning shall provide the recipient of the service with a reasonable amount of time to provide a justification to the online platform to consider that the information to which the suspension relates is not manifestly illegal. Such justifications shall be subject to human review.
Amendment 1328 #
Proposal for a regulation
Article 20 – paragraph 2
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaint-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints or initiate dispute settlements that are manifestly unfounded.
Amendment 1337 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
Article 20 – paragraph 3 – point c
(c) the gravity of the misuse and its consequences, in particular on the exercise of fundamental rights, regardless of the absolute numbers or relative proportion;
Amendment 1342 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
Article 20 – paragraph 3 – point d a (new)
(da) the fact that notices and complaints were submitted following the use of an automated content recognition system;
Amendment 1343 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
Article 20 – paragraph 3 – point d b (new)
(db) any justification provided by the recipient of the service that gives sufficient grounds to consider that the information is not manifestly illegal.
Amendment 1347 #
Proposal for a regulation
Article 20 – paragraph 4
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner and with due regard to their obligations under Article 12(2), in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 1370 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained, and has made best efforts to verify the completeness and reliability of, the following information:
Amendment 1410 #
Proposal for a regulation
Article 22 – paragraph 2
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources.
Amendment 1421 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information swiftly, the online platform shall suspend the provision of its service to the trader until the request is complied with.
Amendment 1445 #
Proposal for a regulation
Article 22 – paragraph 6
Article 22 – paragraph 6
6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 publicly available, in a clear, easily accessible and comprehensible manner.
Amendment 1453 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
Article 22 – paragraph 7 a (new)
7a. Online platforms facilitating short- term holiday rentals must obtain registration numbers, licensing numbers or an equivalent if such a number is required for the offering of short-term holiday rentals by EU, national or local law and must publish this number in the offer.
Amendment 1464 #
Proposal for a regulation
Article 22 a (new)
Article 22 a (new)
Article 22a Transparency for sustainable consumption Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that it provides consumers, in a clear and unambiguous manner and in real time, with information on the environmental impact of its products and services, such as the use of sustainable and efficient delivery methods, sustainable and ecological packaging, as well as the environmental costs of returning goods in the event of withdrawal.
Amendment 1467 #
Proposal for a regulation
Article 23 – paragraph 1 – point a a (new)
Article 23 – paragraph 1 – point a a (new)
(aa) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed;
Amendment 1468 #
Proposal for a regulation
Article 23 – paragraph 1 – point a b (new)
Article 23 – paragraph 1 – point a b (new)
(ab) a list of all trusted flaggers and their area of expertise;
Amendment 1471 #
Proposal for a regulation
Article 23 – paragraph 1 – point c
Article 23 – paragraph 1 – point c
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied, including human review.
Amendment 1484 #
Proposal for a regulation
Article 24 – paragraph 1
Article 24 – paragraph 1
Amendment 1518 #
Proposal for a regulation
Article 24 a (new)
Article 24 a (new)
Amendment 1521 #
Proposal for a regulation
Article 24 b (new)
Article 24 b (new)
Article 24b
Additional obligations for platforms primarily used for the dissemination of user-generated pornographic content
Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
(a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration;
(b) professional human content moderation, in line with Article 14 paragraph 6d (new) and trained to identify image-based sexual abuse, of content having a high probability of being illegal;
(c) the accessibility of a qualified notification procedure in the form that, in addition to the mechanism referred to in Article 14 and respecting the same principles with the exception of paragraph 5a (new), individuals may notify the platform with the claim that image material depicting them or purporting to depict them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure is to be considered manifestly illegal in terms of Article 14 paragraph 6a (new) and to be suspended without undue delay and at the latest within 48 hours.
Amendment 1543 #
Proposal for a regulation
Article 25 a (new)
Article 25 a (new)
Article 25a Legal representatives of very large online platforms Very large online platforms shall establish one point of contact in each Member State and ensure that it is accessible for recipients of the service in at least one of the official languages of that Member State.
Amendment 1549 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the design, functioning and use made of their services in the Union. This risk assessment shall be specific to their services and activities, including technology design, value chain and business-model choices, and shall include the following systemic risks:
Amendment 1572 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any foreseeable impact on the exercise of fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
Amendment 1574 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
Article 26 – paragraph 1 – point c
(c) the intended use, any malfunctioning or intentional manipulation of their service, including by means of commercial communications published on the platform that are not marketed, sold or arranged by the platform or automated exploitation of the service, in particular with an actual or foreseeable impact on the protection of public health, minors and other categories of vulnerable groups of recipients of the service, civic discourse, or actual or foreseeable impacts related to electoral processes and public security;
Amendment 1581 #
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
Article 26 – paragraph 1 – point c a (new)
(ca) any foreseeable negative societal effect of technology design or business- model choices in relation to systemic risks that represent threats to democracy;
Amendment 1582 #
Proposal for a regulation
Article 26 – paragraph 1 – point c b (new)
Article 26 – paragraph 1 – point c b (new)
(cb) any environmental impact such as electricity and water consumption, heat production and CO2 emissions related to the provision of the service and technical infrastructure or to consumer behaviour modification with a direct environmental impact.
Amendment 1585 #
Proposal for a regulation
Article 26 – paragraph 2
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting, targeting and displaying advertisements, as well as the underlying data collection, processing and profiling, influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of content that is incompatible with their terms and conditions.
Amendment 1595 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
Article 26 – paragraph 2 a (new)
2a. The outcome of the risk assessment and supporting documents shall be communicated to the Agency and the Digital Services Coordinator of establishment. A summary version of the risk assessment shall be made publicly available in an easily accessible format.
Amendment 1597 #
Proposal for a regulation
Article 26 – paragraph 2 b (new)
Article 26 – paragraph 2 b (new)
2b. Organisations mandated under Article 68 shall have the right to obtain access to the outcome and supporting documents of a risk assessment and to lodge a complaint against its accuracy or completeness with the Digital Services Coordinator of establishment.
Amendment 1605 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place transparent, proportionate and effective measures to eliminate, prevent and mitigate the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 1608 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, the design, features or functioning of their services, their advertising model or their terms and conditions;
Amendment 1615 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display and targeting of advertisements in association with the service they provide;
Amendment 1616 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes, testing, documentation or supervision of any of their activities in particular as regards detection of systemic risk;
Amendment 1622 #
Proposal for a regulation
Article 27 – paragraph 1 – point e a (new)
Article 27 – paragraph 1 – point e a (new)
(ea) targeted measures aimed at reducing electricity and water consumption, heat production and CO2 emissions related to the provision of the service and technical infrastructure.
Amendment 1624 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
Article 27 – paragraph 1 a (new)
1a. Any measure adopted shall respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 1632 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
Article 27 – paragraph 2 – introductory part
2. The Agency shall publish comprehensive reports, once a year, which shall include the following:
Amendment 1634 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 30, 31 and 33;
Amendment 1643 #
Proposal for a regulation
Article 27 – paragraph 3
Article 27 – paragraph 3
3. The Agency may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Agency shall organise public consultations.
Amendment 1651 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, and additionally where requested by the Agency, to audits to assess compliance with the following:
Amendment 1656 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III. Audits shall at least be performed on:
(i) the clarity, coherence and predictable enforcement of terms of service, with particular regard to the applicable fundamental rights as enshrined in the Charter;
(ii) the completeness, methodology and consistency of the transparency reporting obligations as set out in Articles 13, 13a, 23 and 30, as well as respect for industry standards on transparency reporting;
(iii) the accuracy, predictability and clarity of the provider's follow-up, for recipients of the service and notice providers, to notices of manifestly illegal content and terms of service violations, and the accuracy of the classification (illegal or terms and conditions violation) of removed information;
(iv) internal and third-party complaint-handling mechanisms;
(v) interaction with trusted flaggers and independent assessment of accuracy, response times, efficiency and whether there are indications of abuse;
(vi) diligence with regard to verification of the traceability of traders;
(vii) the adequacy and correctness of the risk assessment as set out in Article 26;
(viii) the adequacy and effectiveness of the measures taken according to Article 27 to address the risks identified in the risk assessments as set out in Article 26;
(ix) the effectiveness of and compliance with codes of conduct.
Audits on the subjects mentioned in points (i) to (vii) may be combined where the organisation performing the audits has subject-specific expertise on the subject matters at hand.
Amendment 1666 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
Article 28 – paragraph 2 – point a
(a) are legally and financially independent from the very large online platform concerned;
Amendment 1667 #
Proposal for a regulation
Article 28 – paragraph 2 – point b
Article 28 – paragraph 2 – point b
Amendment 1669 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
Article 28 – paragraph 2 – point c
(c) have been recognised and vetted by the Agency on the basis of their proven objectivity, subject-specific expertise and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
Amendment 1671 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
Article 28 – paragraph 2 – point c a (new)
(ca) natural persons performing the audits commit not to work for the very large online platform audited or a professional organisation or business association of which the platform is a member for a period of three years after their position in the auditing organisation has ended.
Amendment 1673 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish an audit report for each audit subject as referred to in point (a) of paragraph 1. The report shall be in writing and include at least the following:
Amendment 1674 #
Proposal for a regulation
Article 28 – paragraph 3 – point b a (new)
Article 28 – paragraph 3 – point b a (new)
(ba) a declaration of interests;
Amendment 1675 #
Proposal for a regulation
Article 28 – paragraph 3 – point d
Article 28 – paragraph 3 – point d
(d) a description of the main findings drawn from the audit and a summary of the main findings;
Amendment 1682 #
Proposal for a regulation
Article 28 – paragraph 4
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non- compliance identified.
Amendment 1683 #
Proposal for a regulation
Article 28 – paragraph 4 a (new)
Article 28 – paragraph 4 a (new)
4a. The Agency shall decide on the subject matter of audits to be performed and choose the auditing organisation for the relevant audited subject matter as referred to in paragraph 1. Yearly audits of very large online platforms may not be performed by the same auditing organisation more than three consecutive times. The Agency shall monitor the implementation by the very large platforms of any operational recommendations addressed to them. The Agency shall publish and regularly update a list of vetted organisations that perform audits of very large online platforms. The Agency shall publish and regularly review detailed criteria such organisations need to meet in order to be vetted.
Amendment 1686 #
Proposal for a regulation
Article 28 a (new)
Article 28 a (new)
Amendment 1688 #
Proposal for a regulation
Article 29 – title
Article 29 – title
Recommender systems of very large online platforms
Amendment 1689 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
Amendment 1703 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
Article 29 – paragraph 2 a (new)
2a. In addition to the obligations applicable to all online platforms, very large online platforms shall offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
Amendment 1705 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
Article 29 – paragraph 2 b (new)
2b. Very large online platforms may only limit access to third-party recommender systems temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides by majority vote that they are unnecessary or disproportionate.
Amendment 1706 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
Article 29 – paragraph 2 c (new)
2c. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
Amendment 1717 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available in an easily accessible and comprehensible format and through application programming interfaces a repository containing the information referred to in paragraph 2, until seven years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1727 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, all parameters used for that purpose, including any parameters used to exclude particular groups;
Amendment 1729 #
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
Article 30 – paragraph 2 – point d a (new)
(da) where it is disclosed, a copy of the content of commercial communications published on the very large online platforms that are not marketed, sold or arranged by the very large online platform, which have through appropriate channels been declared as such to the very large online platform;
Amendment 1731 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in terms of impressions and engagements of the advertisement and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 1734 #
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
Article 30 – paragraph 2 – point e a (new)
(ea) in case of an advertisement removed on the basis of a notice submitted in accordance with Article 14 or an order as set out in Article 8, the information referred to in points (b) to (d) of paragraph 2;
Amendment 1741 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
Article 30 – paragraph 2 a (new)
2a. The online platform shall make reasonable efforts to ensure that the information referred to in paragraph 2 is accurate and complete.
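Purely as an illustrative aid, and not part of the amendment text: one entry in the machine-readable advertisement repository that Article 30 would require could be modelled along the following lines. Every field name is hypothetical; points (a) to (c) paraphrase the unamended proposal, while the targeting, impression/engagement and removal fields reflect the amendments above.

```python
# Hypothetical sketch of one Article 30(2) repository entry.
# Field names are illustrative only; the Regulation prescribes the
# data points, not a concrete schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AdRepositoryEntry:
    content: str                       # (a) content of the advertisement
    on_behalf_of: str                  # (b) person on whose behalf it was displayed
    display_period: tuple              # (c) (first, last) dates of display; kept 7 years after last display
    targeted: bool                     # (d) whether it was targeted to particular groups
    targeting_parameters: list = field(default_factory=list)  # (d) ALL parameters, incl. exclusions
    impressions: int = 0               # (e) recipients reached, in impressions
    engagements: int = 0               # (e) recipients reached, in engagements
    removal_basis: Optional[str] = None  # (ea) Article 14 notice or Article 8 order, if removed
    # Note: per paragraph 1, the entry must contain no personal data of recipients.

entry = AdRepositoryEntry(
    content="Example ad creative",
    on_behalf_of="Example Ltd",
    display_period=("2021-01-01", "2021-03-31"),
    targeted=True,
    targeting_parameters=["interest:travel", "exclude:age<18"],
    impressions=120_000,
    engagements=3_400,
)
```

A repository exposed "through application programming interfaces" would then serve collections of such entries in a machine-readable serialisation.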
Amendment 1753 #
Proposal for a regulation
Article 31 – paragraph 1
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or an independent enforcement and monitoring unit of the Agency, upon reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
Amendment 1754 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. Upon a reasoned request from at least three Digital Services Coordinators of destination, the Digital Services Coordinator of establishment or the Agency, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers, vetted not-for-profit bodies, organisations or associations or vetted media organisations who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, mitigation and understanding of systemic risks as set out in Article 26(1) and Article 27(1).
Amendment 1762 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, not receive any funding from any of the very large online platforms as defined in Article 25, disclose all funding sources, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. In order to be vetted, not-for-profit bodies, organisations or associations have to meet the requirements laid down in Article 68, have statutory objectives which are in the public interest, and have expertise related to the fields referred to in Article 26.
Amendment 1769 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Agency, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers, not-for-profit bodies, organisations or associations or media organisations can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1777 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because it does not have access to the data.
Amendment 1779 #
Proposal for a regulation
Article 31 – paragraph 6 – point a
Article 31 – paragraph 6 – point a
Amendment 1780 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
Article 31 – paragraph 6 – point b
Amendment 1786 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
Article 31 – paragraph 7 a (new)
Amendment 1790 #
Proposal for a regulation
Article 31 – paragraph 7 b (new)
Article 31 – paragraph 7 b (new)
7b. The Commission shall issue regulatory guidance for very large online platforms and consult with the European Data Protection Board to facilitate the drafting and implementation of codes of conduct at Union level between very large online platforms and vetted researchers, not-for-profit bodies, organisations or associations or media organisations on appropriate technical and organisational safeguards to be implemented before data can be shared pursuant to paragraphs 1 and 2.
Amendment 1791 #
Proposal for a regulation
Article 31 – paragraph 7 c (new)
Article 31 – paragraph 7 c (new)
7c. Upon completion of the research envisaged in Article 31(2), the vetted researchers, not-for-profit bodies, organisations or associations or media organisations, shall make their research publicly available, while fully respecting the rights and interests of the recipients of the service concerned in compliance with Regulation (EU) 2016/679.
Amendment 1797 #
Proposal for a regulation
Article 33 – paragraph 1
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months in a standardised, machine-readable and easily accessible format.
Amendment 1801 #
Proposal for a regulation
Article 33 – paragraph 2 – point d a (new)
Article 33 – paragraph 2 – point d a (new)
(da) aggregate numbers for the total views and view rate of content prior to a removal on the basis of orders issued in accordance with Article 8 or content moderation engaged in at the provider’s own initiative and under its terms and conditions.
Amendment 1806 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Amendment 1813 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
Article 34 – paragraph 1 – introductory part
1. Where necessary to achieve agreed and clearly defined public objectives, the Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies at least for the following:
Amendment 1816 #
Proposal for a regulation
Article 34 – paragraph 1 – point a
(a) electronic submission of notices under Article 14 in a manner that permits the logging and, where possible, the automatic publication of all relevant statistical data;
Amendment 1819 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
(b) electronic submission of notices by trusted flaggers under Article 19, including, if necessary, through application programming interfaces, and which permit the logging and, where possible, the automatic publication of all relevant statistical data;
Amendment 1820 #
Proposal for a regulation
Article 34 – paragraph 1 – point b a (new)
(ba) terms and criteria for the submission of notices in a diligent manner by trusted flaggers under Article 19;
Amendment 1821 #
Proposal for a regulation
Article 34 – paragraph 1 – point c
(c) specific interfaces, including application programming interfaces or other mechanisms, to facilitate compliance with the obligations set out in Articles 30 and 31;
Amendment 1830 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) transparency reporting obligations pursuant to Article 13;
Amendment 1832 #
Proposal for a regulation
Article 34 – paragraph 1 – point f b (new)
(fb) the design of online interfaces regarding inter alia the acceptance of and changes to terms and conditions, settings, advertising practices, recommender systems, and decisions within the content moderation process to prevent dark patterns;
Amendment 1833 #
Proposal for a regulation
Article 34 – paragraph 1 – point f c (new)
(fc) electricity, water and heat consumption, including such consumption caused by artificial intelligence and recommender systems used by very large online platforms;
Amendment 1834 #
Proposal for a regulation
Article 34 – paragraph 1 – point f d (new)
(fd) data sufficiency, aiming at the reduction of data generation, in particular traffic data, including the reduction of associated electricity, water and heat consumption and resources from data centres.
Amendment 1841 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. At least with regard to points (a), (b) and (ba new) of paragraph 1, the Commission shall carry out thorough impact assessments before implementation in order to ensure compliance with Union law. In particular, such mechanisms shall not lead to restrictions being automatically imposed on notified content.
Amendment 1845 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Agency shall encourage and facilitate the drafting and implementation of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges and responsibilities involved in comprehensively tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. Particular attention shall be given to avoiding counterproductive effects on competition, data access and security, the general monitoring prohibition and the rights of individuals. The Commission and the Agency shall approve and be party to any such code of conduct, in order to ensure adequate accountability and legal redress for individuals.
Amendment 1859 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1862 #
Proposal for a regulation
Article 35 – paragraph 3
Amendment 1869 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Agency shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate, at least once a year, the achievement of their objectives, and include at least the following points: (a) the evolution of the scale and nature of the public policy problem being addressed by the relevant code; (b) the existence or emergence of commercial interests on the part of the online platform that may disincentivise the successful implementation of the code; (c) whether there are adequate safeguards to ensure the rights of individuals and businesses. They shall publish their conclusions.
Amendment 1875 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Agency shall regularly monitor and evaluate, at least once a year, the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Amendment 1877 #
Proposal for a regulation
Article 35 – paragraph 5 a (new)
5a. For each Code of Conduct a European Citizens’ Assembly is established that monitors outcomes of the Codes of Conduct, discusses the main issues at stake publicly and sets out public policy recommendations to the Commission. The members of the European Citizens’ Assemblies shall be randomly selected so as to be broadly representative of European society, taking into account gender, age, location and social class.
Amendment 1885 #
Proposal for a regulation
Article 36 – paragraph 2 – point a
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24a new;
Amendment 1886 #
Proposal for a regulation
Article 36 – paragraph 2 – point b
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30, in particular the information referred to in points (d) and (da new) of paragraph 2 of Article 30.
Amendment 1906 #
Proposal for a regulation
Article 38 – paragraph 1
1. Member States shall designate one or more competent authorities as responsible for the application and enforcement of this Regulation (‘competent authorities’), without prejudice to the procedures for the supervision of very large online platforms laid out in Section 3.
Amendment 1911 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 2
For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities and the Agency, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator.
Amendment 1912 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 3
Where a Member State designates more than one competent authority in addition to the Digital Services Coordinator, it shall ensure that the respective tasks of those authorities and of the Digital Services Coordinator are clearly defined and that they cooperate closely and effectively when performing their tasks. The Member State concerned shall communicate the name of the other competent authorities as well as their respective tasks to the Commission and the Agency.
Amendment 1933 #
Proposal for a regulation
Article 40 – paragraph 3
3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Digital Services Coordinators and ensure that the principle of ne bis in idem is respected.
Amendment 1938 #
Proposal for a regulation
Article 40 – paragraph 4
4. Paragraphs 1, 2 and 3 are without prejudice to the procedures for the supervision of very large online platforms as laid out in Section 3.
Amendment 1942 #
Proposal for a regulation
Article 41 – paragraph 1 – point a
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information without undue delay, or at the latest within one month;
Amendment 1947 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 2
As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those other persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities.
Amendment 1948 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – introductory part
Amendment 1949 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – point a
(a) require the management body of the providers, within a reasonable time period which shall in any case not exceed three months, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken;
Amendment 1951 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 1 – point b
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists or is continuously repeated and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
Amendment 1953 #
Proposal for a regulation
Article 41 – paragraph 3 – subparagraph 2
The Digital Services Coordinator shall, except where it acts upon the Commission’s request referred to in Article 65, prior to submitting the request referred to in point (b) of the first subparagraph, invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof. The provider, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned.
Amendment 1957 #
Proposal for a regulation
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 10 % of the annual worldwide income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 2 % of the annual worldwide income or turnover of the provider concerned.
Amendment 1960 #
Proposal for a regulation
Article 42 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 10 % of the average daily worldwide turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
Amendment 1965 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service, as well as bodies, organisations or associations referred to in Article 68, independently of a recipient’s mandate, shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment without undue delay. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority without undue delay. Where the complaint falls under the responsibility of the Agency, the Digital Services Coordinator receiving the complaint shall transmit it to the Agency without undue delay.
Amendment 1971 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Recipients of the service or their representatives that lodged the complaint should have a right to be heard in the procedure conducted by the competent authority and should be informed about each stage of the procedure by the Digital Services Coordinator assessing their claim. They shall obtain a response from the Digital Services Coordinator within three months of lodging their complaint.
Amendment 1973 #
Proposal for a regulation
Article 43 – paragraph 1 b (new)
A decision on the complaint shall be taken without delay and within six months at the latest.
Amendment 1975 #
Proposal for a regulation
Article 44 – paragraph 1
1. Digital Services Coordinators shall draw up a clear and detailed annual report on their activities under this Regulation. They shall make the annual reports available to the public in a standardised and machine-readable format, and shall communicate them to the Commission and to the Agency.
Amendment 1979 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
(ba) the number of appeals made against those orders raised by providers of intermediary services or recipients of the service as well as the outcome of appeals;
Amendment 1980 #
Proposal for a regulation
Article 44 – paragraph 2 – point b b (new)
(bb) in the case of criminal law violations, the number of orders which led to investigation and prosecution of the underlying offences.
Amendment 1983 #
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 1
Where a Digital Services Coordinator has reasons to suspect that a provider of an intermediary service, not under the jurisdiction of the Member State concerned and not falling under the procedures laid out in Section 3, infringed this Regulation, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
Amendment 1984 #
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 2
Where the Agency has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may recommend the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
Amendment 1991 #
Proposal for a regulation
Article 45 – paragraph 2 – point b
(b) a description of the relevant facts, the provisions of this Regulation concerned and the reasons why the Digital Services Coordinator that sent the request, or the Agency, suspects that the provider infringed this Regulation;
Amendment 1992 #
Proposal for a regulation
Article 45 – paragraph 2 – point c
(c) any other information that the Digital Services Coordinator that sent the request, or the Agency, considers relevant, including, where appropriate, information gathered on its own initiative or suggestions for specific investigatory or enforcement measures to be taken, including interim measures.
Amendment 1997 #
Proposal for a regulation
Article 45 – paragraph 3
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Agency, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
Amendment 1999 #
Proposal for a regulation
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Agency, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation.
Amendment 2003 #
Proposal for a regulation
Article 45 – paragraph 5
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Agency, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Agency, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
Amendment 2007 #
Proposal for a regulation
Article 45 – paragraph 6
6. The Agency shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board.
Amendment 2012 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Agency concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request.
Amendment 2016 #
Proposal for a regulation
Article 46 – title
Joint investigations and requests for Agency intervention
Amendment 2018 #
Proposal for a regulation
Article 46 – paragraph 1 – subparagraph 1
Digital Services Coordinators may participate in joint investigations, which may be coordinated with the support of the Agency, with regard to matters covered by this Regulation, concerning providers of intermediary services operating in several Member States.
Amendment 2019 #
Proposal for a regulation
Article 46 – paragraph 1 – subparagraph 2
Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and exercise of those powers provided in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Agency through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation.
Amendment 2025 #
Proposal for a regulation
Article 46 – paragraph 2
2. Where a Digital Services Coordinator of establishment has reasons to suspect that a very large online platform infringed this Regulation, it may request the Agency to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting the Agency to intervene.
Amendment 2026 #
Proposal for a regulation
Article 47 – title
European Platform Agency
Amendment 2027 #
Proposal for a regulation
Article 47 – title
European Platform Agency
Amendment 2031 #
Proposal for a regulation
Article 47 – paragraph 1
1. An independent oversight body for providers of very large online platforms named ‘European Platform Agency’ (the ‘Agency’) is established as a body of the Union and shall have legal personality.
Amendment 2032 #
Proposal for a regulation
Article 47 – paragraph 1 a (new)
1a. The Agency shall be responsible for all matters relating to the application and enforcement of this Regulation for very large online platforms, in accordance with the procedures laid out in Section 3 of this Regulation.
Amendment 2035 #
Proposal for a regulation
Article 47 – paragraph 2 – introductory part
2. The Agency shall advise the Digital Services Coordinators and the Commission in accordance with this Regulation to achieve the following objectives:
Amendment 2038 #
Proposal for a regulation
Article 47 – paragraph 2 – point a
(a) contributing to the consistent application of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation;
Amendment 2042 #
Proposal for a regulation
Article 47 – paragraph 2 – point b
(b) coordinating and contributing to guidance and analysis of the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation;
Amendment 2044 #
Proposal for a regulation
Article 47 – paragraph 2 – point c
Amendment 2045 #
Proposal for a regulation
Article 47 – paragraph 2 a (new)
2a. In so far as is necessary in order to achieve the objectives set out in this Regulation, and without prejudice to the competence of the Member States and of the Union institutions, the Agency may cooperate with the competent authorities of third countries and with international organisations. To that end, the Agency may, subject to the authorisation of the Oversight Board and after the approval of the Commission, establish working arrangements with the competent authorities of third countries and with international organisations. Those arrangements shall not create legal obligations on the Union or the Member States.
Amendment 2047 #
Proposal for a regulation
Article 48 – title
Structure of the Agency
Amendment 2048 #
Proposal for a regulation
Article 48 – paragraph 1
1. The Agency shall be composed of an operating part of the Agency and an Oversight Board.
Amendment 2053 #
Proposal for a regulation
Article 48 – paragraph 2
Amendment 2055 #
Proposal for a regulation
Article 48 – paragraph 2 – subparagraph 2
Amendment 2058 #
Proposal for a regulation
Article 48 – paragraph 3
Amendment 2062 #
Proposal for a regulation
Article 48 – paragraph 4
Amendment 2064 #
Proposal for a regulation
Article 48 – paragraph 5
5. The Agency may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Agency shall make the results of this cooperation publicly available.
Amendment 2071 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Agency shall adopt its rules of procedure, following the consent of the Commission.
Amendment 2073 #
Proposal for a regulation
Article 48 a (new)
Amendment 2074 #
Proposal for a regulation
Article 48 b (new)
Amendment 2075 #
Proposal for a regulation
Article 48 c (new)
Amendment 2076 #
Proposal for a regulation
Article 48 d (new)
Article 48d Staff of the Agency 1. The Staff Regulations of Officials of the European Union, the Conditions of Employment of Other Servants and the rules adopted jointly by the institutions of the Union for the purposes of the application of those Staff Regulations and Conditions of Employment shall apply to the staff employed by the Agency. 2. The staff of the Agency shall consist of servants recruited by the Agency as necessary to perform its tasks. They shall have security clearances appropriate to the classification of the information they are handling. 3. The Agency’s internal rules, such as the rules of procedure of the Oversight Board, the financial rules applicable to the Agency, the rules for the application of the staff regulations and the rules for access to documents, shall ensure the autonomy and independence of staff.
Amendment 2077 #
Proposal for a regulation
Article 48 e (new)
Article 48e Headquarters agreement and operating conditions 1. The Agency shall be headquartered in Brussels, Belgium. 2. The necessary arrangements concerning the accommodation to be provided for the Agency in the host Member State, together with the specific rules applicable in the host Member State to the members of the Oversight Board, staff and members of their families, shall be laid down in a Headquarters agreement between the Agency and the Member State where the seat is located, to be concluded after obtaining the approval of the Management Board and no later than one year after this Regulation enters into force. 3. The Agency’s host Member State shall provide the best possible conditions to ensure the smooth and efficient functioning of the Agency, including multilingual, European-oriented schooling and appropriate transport connections.
Amendment 2078 #
Proposal for a regulation
Article 48 f (new)
Article 48f Commencement of the Agency’s activities 1. The Agency shall become operational with the capacity to implement its own budget by the date on which this Regulation enters into application. 2. The Commission shall be responsible for the establishment and initial operation of the Agency until the Agency becomes operational. For that purpose, until the Oversight Board takes up its duties following its appointment, the Commission may designate five Commission officials to act as an interim Oversight Board.
Amendment 2079 #
Proposal for a regulation
Article 49 – title
Amendment 2080 #
Proposal for a regulation
Article 49 – paragraph 1 – introductory part
1. Where necessary to meet the objectives set out in Article 47(2), the Agency shall in particular:
Amendment 2083 #
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
(ca) convene regular joint meetings of all Digital Services Coordinators for them to exchange views on and coordinate their supervisory activities;
Amendment 2084 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
Amendment 2094 #
Proposal for a regulation
Article 50 – title
Amendment 2095 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
Amendment 2098 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Agency, acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of the provisions of Section 4 of Chapter III, investigate the suspected infringement and communicate this decision to the very large online platform concerned.
Amendment 2102 #
Proposal for a regulation
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Agency shall request the very large online platform to draw up and communicate to the Agency, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
Amendment 2105 #
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 1
Article 50 – paragraph 3 – subparagraph 1
Within one month following receipt of the action plan, the Agency shall decide whether the action plan is appropriate to terminate or remedy the infringement.
Amendment 2107 #
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 2
Article 50 – paragraph 3 – subparagraph 2
Where the Agency has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to the Agency within four months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Agency may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2).
Amendment 2109 #
Proposal for a regulation
Article 50 – paragraph 4 – subparagraph 1 – introductory part
Article 50 – paragraph 4 – subparagraph 1 – introductory part
4. The Agency shall communicate to the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable:
Amendment 2112 #
Proposal for a regulation
Article 50 – paragraph 4 – subparagraph 2
Article 50 – paragraph 4 – subparagraph 2
Amendment 2115 #
Proposal for a regulation
Article 51 – title
Article 51 – title
Amendment 2117 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
Article 51 – paragraph 1 – introductory part
1. The Agency may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
Amendment 2124 #
Proposal for a regulation
Article 51 – paragraph 1 – point b
Article 51 – paragraph 1 – point b
(b) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment requested the Agency to intervene in accordance with Article 46(2), upon the reception of that request;
Amendment 2126 #
Proposal for a regulation
Article 51 – paragraph 1 – point c a (new)
Article 51 – paragraph 1 – point c a (new)
(ca) has been found to not implement the operational recommendations from the independent audit as laid out in Article 28(4).
Amendment 2132 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Article 51 – paragraph 2 – subparagraph 1
Where the Agency decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
Amendment 2134 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 2
Article 51 – paragraph 2 – subparagraph 2
As regards points (a) and (b) of paragraph 1, pursuant to that notification, the Digital Services Coordinator of establishment concerned shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to any measures that it may take at the request of the Agency.
Amendment 2135 #
Proposal for a regulation
Article 51 – paragraph 3 – introductory part
Article 51 – paragraph 3 – introductory part
3. The Digital Services Coordinator referred to in Articles 45(7), 46(2) and 50(1), as applicable, shall, without undue delay upon being informed, transmit to the Agency:
Amendment 2137 #
Proposal for a regulation
Article 51 – paragraph 3 – point a
Article 51 – paragraph 3 – point a
(a) any information that that Digital Services Coordinator exchanged relating to the infringement or the suspected infringement, as applicable, with the Board and with the very large online platform concerned;
Amendment 2138 #
Proposal for a regulation
Article 51 – paragraph 3 – point c
Article 51 – paragraph 3 – point c
(c) any other information in the possession of that Digital Services Coordinator that may be relevant to the proceedings initiated by the Agency.
Amendment 2139 #
Proposal for a regulation
Article 51 – paragraph 4
Article 51 – paragraph 4
4. The Board, and the Digital Services Coordinators making the request referred to in Article 45(1), shall, without undue delay upon being informed, transmit to the Commission any information in their possession that may be relevant to the proceedings initiated by the Agency.
Amendment 2144 #
Proposal for a regulation
Article 52 – paragraph 1
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Agency may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
Amendment 2146 #
Proposal for a regulation
Article 52 – paragraph 2
Article 52 – paragraph 2
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Agency shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect or misleading information.
Amendment 2149 #
Proposal for a regulation
Article 52 – paragraph 3
Article 52 – paragraph 3
3. Where the Agency requires the very large online platform concerned or other person referred to in Article 52(1) to supply information by decision, it shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which it is to be provided. It shall also indicate the penalties provided for in Article 59 and indicate or impose the periodic penalty payments provided for in Article 60. It shall further indicate the right to have the decision reviewed by the Court of Justice of the European Union.
Amendment 2152 #
Proposal for a regulation
Article 52 – paragraph 5
Article 52 – paragraph 5
5. At the request of the Agency, the Digital Services Coordinators and other competent authorities shall provide the Agency with all necessary information to carry out the tasks assigned to it under this Section.
Amendment 2154 #
Proposal for a regulation
Article 53 – paragraph 1
Article 53 – paragraph 1
In order to carry out the tasks assigned to it under this Section, the Agency may interview any natural or legal person which consents to being interviewed for the purpose of collecting information, relating to the subject-matter of an investigation, in relation to the suspected infringement or infringement, as applicable.
Amendment 2157 #
Proposal for a regulation
Article 54 – paragraph 1
Article 54 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Agency may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1).
Amendment 2159 #
Proposal for a regulation
Article 54 – paragraph 2
Article 54 – paragraph 2
2. On-site inspections may also be carried out with the assistance of auditors or experts appointed by the Agency pursuant to Article 57(2).
Amendment 2161 #
Proposal for a regulation
Article 54 – paragraph 3
Article 54 – paragraph 3
3. During on-site inspections the Agency and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Agency and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1).
Amendment 2163 #
Proposal for a regulation
Article 54 – paragraph 4
Article 54 – paragraph 4
4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Agency. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union.
Amendment 2168 #
Proposal for a regulation
Article 55 – paragraph 1
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Agency may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
Amendment 2173 #
Proposal for a regulation
Article 56 – paragraph 1
Article 56 – paragraph 1
1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Agency may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action.
Amendment 2175 #
Proposal for a regulation
Article 56 – paragraph 2 – introductory part
Article 56 – paragraph 2 – introductory part
2. The Agency may, upon request or on its own initiative, reopen the proceedings:
Amendment 2178 #
Proposal for a regulation
Article 56 – paragraph 3
Article 56 – paragraph 3
3. Where the Agency considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall reject those commitments in a reasoned decision when concluding the proceedings.
Amendment 2183 #
Proposal for a regulation
Article 57 – paragraph 1
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Agency may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Agency may also order that platform to provide access to, and explanations relating to, its databases and algorithms.
Amendment 2185 #
Proposal for a regulation
Article 57 – paragraph 2
Article 57 – paragraph 2
2. The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors to assist the Agency in monitoring compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the Agency.
Amendment 2186 #
Proposal for a regulation
Article 57 a (new)
Article 57 a (new)
Article 57a
Right to lodge a complaint with the Agency
Article 43 shall also be applicable to complaints lodged with the Agency with regard to its oversight and enforcement of the provisions of Section 4 of Chapter III.
Amendment 2189 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
Article 58 – paragraph 1 – introductory part
1. The Agency shall adopt a non-compliance decision where it finds that the very large online platform concerned does not comply with one or more of the following:
Amendment 2193 #
Proposal for a regulation
Article 58 – paragraph 2
Article 58 – paragraph 2
2. Before adopting the decision pursuant to paragraph 1, the Agency shall communicate its preliminary findings to the very large online platform concerned. In the preliminary findings, the Agency shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings.
Amendment 2196 #
Proposal for a regulation
Article 58 – paragraph 3
Article 58 – paragraph 3
3. In the decision adopted pursuant to paragraph 1 the Agency shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable time period and to provide information on the measures that that platform intends to take to comply with the decision.
Amendment 2200 #
Proposal for a regulation
Article 58 – paragraph 4
Article 58 – paragraph 4
4. The very large online platform concerned shall provide the Agency with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation.
Amendment 2205 #
Proposal for a regulation
Article 58 – paragraph 5
Article 58 – paragraph 5
5. Where the Agency finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision.
Amendment 2208 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the Agency may impose on the very large online platform concerned fines not exceeding 10% of its total worldwide turnover in the preceding financial year where it finds that that platform, intentionally or negligently:
Amendment 2214 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
Article 59 – paragraph 2 – introductory part
2. The Agency may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 2% of the total worldwide turnover in the preceding financial year, where they intentionally or negligently:
Amendment 2217 #
Proposal for a regulation
Article 59 – paragraph 2 – point b
Article 59 – paragraph 2 – point b
(b) fail to rectify within the time period set by the Agency, incorrect, incomplete or misleading information given by a member of staff, or fail or refuse to provide complete information;
Amendment 2221 #
Proposal for a regulation
Article 59 – paragraph 3
Article 59 – paragraph 3
3. Before adopting the decision pursuant to paragraph 2, the Agency shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1).
Amendment 2225 #
Proposal for a regulation
Article 59 – paragraph 4
Article 59 – paragraph 4
4. In fixing the amount of the fine, the Agency shall have regard to the nature, gravity, duration and recurrence of the infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
Amendment 2226 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
Article 60 – paragraph 1 – introductory part
1. The Agency may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 10 % of the average daily worldwide turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
Amendment 2230 #
Proposal for a regulation
Article 60 – paragraph 2
Article 60 – paragraph 2
2. Where the very large online platform concerned or other person referred to in Article 52(1) has satisfied the obligation which the periodic penalty payment was intended to enforce, the Agency may fix the definitive amount of the periodic penalty payment at a figure lower than that which would arise under the original decision.
Amendment 2231 #
Proposal for a regulation
Article 61 – paragraph 1
Article 61 – paragraph 1
1. The powers conferred on the Agency by Articles 59 and 60 shall be subject to a limitation period of five years.
Amendment 2232 #
Proposal for a regulation
Article 61 – paragraph 3 – introductory part
Article 61 – paragraph 3 – introductory part
3. Any action taken by the Agency or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following:
Amendment 2234 #
Proposal for a regulation
Article 61 – paragraph 3 – point a
Article 61 – paragraph 3 – point a
(a) requests for information by the Agency or by a Digital Services Coordinator;
Amendment 2236 #
Proposal for a regulation
Article 61 – paragraph 3 – point c
Article 61 – paragraph 3 – point c
(c) the opening of a proceeding by the Agency pursuant to Article 51(2).
Amendment 2238 #
Proposal for a regulation
Article 61 – paragraph 4
Article 61 – paragraph 4
4. Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Agency having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period is suspended pursuant to paragraph 5.
Amendment 2239 #
Proposal for a regulation
Article 61 – paragraph 5
Article 61 – paragraph 5
5. The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Agency is the subject of proceedings pending before the Court of Justice of the European Union.
Amendment 2244 #
Proposal for a regulation
Article 62 – paragraph 1
Article 62 – paragraph 1
1. The power of the Agency to enforce decisions taken pursuant to Articles 59 and 60 shall be subject to a limitation period of five years.
Amendment 2245 #
Proposal for a regulation
Article 62 – paragraph 3 – point b
Article 62 – paragraph 3 – point b
(b) by any action of the Agency, or of a Member State acting at the request of the Agency, designed to enforce payment of the fine or periodic penalty payment.
Amendment 2247 #
Proposal for a regulation
Article 63 – paragraph 1 – introductory part
Article 63 – paragraph 1 – introductory part
1. Before adopting a decision pursuant to Articles 58(1), 59 or 60, the Agency shall give the very large online platform concerned or other person referred to in Article 52(1) the opportunity of being heard on:
Amendment 2248 #
Proposal for a regulation
Article 63 – paragraph 1 – point a
Article 63 – paragraph 1 – point a
(a) preliminary findings of the Agency, including any matter to which the Agency has taken objections; and
Amendment 2250 #
Proposal for a regulation
Article 63 – paragraph 1 – point b
Article 63 – paragraph 1 – point b
(b) measures that the Agency may intend to take in view of the preliminary findings referred to in point (a).
Amendment 2252 #
Proposal for a regulation
Article 63 – paragraph 2
Article 63 – paragraph 2
2. The very large online platform concerned or other person referred to in Article 52(1) may submit their observations on the Agency's preliminary findings within a reasonable time period set by the Agency in its preliminary findings, which may not be less than 14 days.
Amendment 2254 #
Proposal for a regulation
Article 63 – paragraph 3
Article 63 – paragraph 3
3. The Agency shall base its decisions only on objections on which the parties concerned have been able to comment.
Amendment 2257 #
Proposal for a regulation
Article 63 – paragraph 4
Article 63 – paragraph 4
4. The rights of defence of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the Agency's file under the terms of a negotiated disclosure, subject to the legitimate interest of the very large online platform concerned or other person referred to in Article 52(1) in the protection of their business secrets. The right of access to the file shall not extend to confidential information and internal documents of the Agency or Member States' authorities. In particular, the right of access shall not extend to correspondence between the Agency and those authorities. Nothing in this paragraph shall prevent the Agency from disclosing and using information necessary to prove an infringement.
Amendment 2258 #
Proposal for a regulation
Article 63 – paragraph 6
Article 63 – paragraph 6
6. Without prejudice to the exchange and to the use of information referred to in Articles 51(3) and 52(5), the Agency, Member States' authorities and their respective officials, servants and other persons working under their supervision; and any other natural or legal person involved, including auditors and experts appointed pursuant to Article 57(2) shall not disclose information acquired or exchanged by them pursuant to this Section and of the kind covered by the obligation of professional secrecy.
Amendment 2259 #
Proposal for a regulation
Article 64 – paragraph 1
Article 64 – paragraph 1
1. The Agency shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed.
Amendment 2261 #
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 1
Article 65 – paragraph 1 – subparagraph 1
Where all powers pursuant to this Article to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Agency may act pursuant to Article 41(3).
Amendment 2264 #
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 2
Article 65 – paragraph 1 – subparagraph 2
Prior to submitting the request according to Article 41(3)(b), the Agency shall invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures it intends to request and identifying the intended addressee or addressees thereof.
Amendment 2267 #
Proposal for a regulation
Article 65 – paragraph 2 – subparagraph 1
Article 65 – paragraph 2 – subparagraph 1
Where the coherent application of this Regulation so requires, the Agency may submit written observations to the competent judicial authority referred to in Article 41(3). With the permission of the judicial authority in question, it may also make oral observations.
Amendment 2269 #
Proposal for a regulation
Article 66 – paragraph 1
Article 66 – paragraph 1
Amendment 2270 #
Proposal for a regulation
Article 66 – paragraph 1 – point a
Article 66 – paragraph 1 – point a
Amendment 2271 #
Proposal for a regulation
Article 66 – paragraph 1 – point b
Article 66 – paragraph 1 – point b
Amendment 2272 #
Proposal for a regulation
Article 66 – paragraph 1 – point c
Article 66 – paragraph 1 – point c
Amendment 2273 #
Proposal for a regulation
Article 66 – paragraph 2
Article 66 – paragraph 2
Amendment 2275 #
Proposal for a regulation
Article 67 – paragraph 1
Article 67 – paragraph 1
1. The Agency shall establish and maintain a reliable and secure information sharing system supporting communications between Digital Services Coordinators, the Commission and the Agency.
Amendment 2276 #
Proposal for a regulation
Article 67 – paragraph 2
Article 67 – paragraph 2
2. The Digital Services Coordinators, the Commission and the Agency shall use the information sharing system for all communications pursuant to this Regulation.
Amendment 2277 #
Proposal for a regulation
Article 67 – paragraph 3
Article 67 – paragraph 3
Amendment 2278 #
Proposal for a regulation
Article 68 – paragraph 1 – introductory part
Article 68 – paragraph 1 – introductory part
Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council52, recipients of intermediary services shall have the right to mandate a public body or their representatives, or a body, organisation or association to exercise the rights referred to in Articles 8, 12, 13, 14, 15, 17, 18, 19 and 43, as well as all secondary claims, on their behalf, provided the body, organisation or association meets all of the following conditions:
__________________
52 [Reference]
Amendment 2289 #
Proposal for a regulation
Article 72 a (new)
Article 72 a (new)