195 Amendments of Eva KAILI related to 2020/0361(COD)
Amendment 75 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, clear, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.
Amendment 78 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or on the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 85 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable Union or national law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 89 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, online marketplaces or search engines, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 98 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider or where the provider of the service promotes and optimises the content.
Amendment 99 #
Proposal for a regulation
Recital 18 a (new)
(18 a) The exemptions from liability should also not be available to providers of intermediary services that do not comply with the due diligence obligations set out in this Regulation. The conditionality should further ensure that the standards to qualify for those exemptions contribute to a high level of safety and trust in the online environment in a manner that promotes a fair balance of the rights of all stakeholders.
Amendment 100 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services whose main purpose is to engage in or facilitate illegal activities, or that collaborates with a recipient of the services in order to undertake illegal activities, does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 104 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in observance of the principles enshrined in the Charter of Fundamental Rights, including freedom of expression. Where the illegal content can cause significant public harm, the provider should assess and, when necessary, remove or disable access to that content within 24 hours and, in any case, not more than one hour after receiving a removal order from the competent authority. The provider can obtain such actual knowledge or awareness through, in particular, its periodic own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 107 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, unless they comply with a number of specific requirements set out in this Regulation, including the appointment of a legal representative in the Union, the implementation of notice and action mechanisms, the traceability of traders using their services, the provision of information on their online advertising and their recommender system practices and policy as well as transparency requirements towards the consumers as laid down in Directive 2011/83/EU. In addition, they should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 113 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union or national law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 115 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should also attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
Amendment 116 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or neutral hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, content delivery networks or providers of services deeper in the internet stack, such as IT infrastructure services (on-premise, cloud-based or hybrid hosting solutions), that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service. Services deeper in the internet stack acting as online intermediaries could be required to take proportionate actions where the customer fails to remove the illegal content, unless technically impracticable.
Amendment 120 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as impeding upon the ability of providers to undertake proactive measures to identify and remove illegal content and to prevent its reappearance.
Amendment 128 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers, by professional entities and by users of services which have a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 129 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
Amendment 130 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Providers of intermediary services that qualify as small or micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to establish collective representation under the guidance of the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative.
Amendment 131 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easy to access and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider shall assess the illegality of the identified content and, based on that assessment, can decide whether or not it agrees with that notification of illegal content and wishes to remove or disable access to that content ('action'). In the event that the provider of hosting services assesses the notice of illegal content to be positive and thus decides to remove or disable access to it, it shall ensure that such content remains inaccessible after take down. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 135 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. To ensure that the requirement to address such decisions to recipients of the service does not present an undue and disproportionate burden, providers of hosting services should be allowed to use automated means to draft the decisions and address them to the recipients of the service concerned. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 137 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent the reappearance of the notified illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 140 #
Proposal for a regulation
Recital 43
Amendment 144 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which must ensure human review and meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner and within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 148 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and are known to flag content frequently with a high rate of accuracy, that they represent collective interests and that they work in a diligent, objective and effective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry representing collective interests and of right-holders specifically created for that purpose could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions and ensure independent public interest representation. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 153 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 162 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned and by other intermediaries, such as advertising services, web hosting and domain name registration services, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 168 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have an impact both on the equal treatment and opportunities of citizens and on the perpetuation of harmful stereotypes and norms. Therefore, more transparency is needed in online advertising markets, and independent research should be carried out to assess the effectiveness of behavioural advertising, which could pave the way for stricter measures or restrictions on behavioural advertising. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 171 #
Proposal for a regulation
Recital 52 a (new)
(52 a) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 186 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should present the main parameters for such recommender systems separately, in a clear, concise, accessible and easily comprehensible manner, to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, and shall not make the recipients of their services subject to recommender systems based on profiling by default.
Amendment 187 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter of Fundamental Rights of the European Union and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
Amendment 188 #
Proposal for a regulation
Recital 63
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to creating financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and participation at multiple levels of the advertising value chain has created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients. Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
Amendment 213 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(b a) promote innovation and facilitate competition for digital services, while protecting users' and consumers' rights.
Amendment 214 #
Proposal for a regulation
Article 1 – paragraph 2 – point b b (new)
(b b) promote a level playing field in the online ecosystem by introducing interoperability requirements for very large platforms.
Amendment 218 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) the Charter of Fundamental Rights of the European Union;
Amendment 221 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union or, in the absence of such an establishment, where the assessment of a substantial connection is based on specific factual criteria, such as where the provider targets its activities towards one or more Member States.
Amendment 222 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
Amendment 223 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
Amendment 230 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State that is consistent with Union law, irrespective of the precise subject matter or nature of that law;
Amendment 232 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation, and which governs itself under specific terms and conditions;
Amendment 240 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank and prioritise specific information in its online interface to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 241 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to rank, prioritise and suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 246 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that the provider:
Amendment 247 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) the provider does not modify the information;
Amendment 248 #
Proposal for a regulation
Article 4 – paragraph 1 – point b
(b) the provider complies with conditions on access to the information;
Amendment 249 #
Proposal for a regulation
Article 4 – paragraph 1 – point c
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
Amendment 250 #
Proposal for a regulation
Article 4 – paragraph 1 – point d
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
Amendment 252 #
Proposal for a regulation
Article 4 – paragraph 1 – point e
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
Amendment 253 #
Proposal for a regulation
Article 4 – paragraph 1 – point e a (new)
(e a) the provider not only immediately deletes illegal content after positive identification, but also continuously transmits it, including the metadata necessary for this purpose, to the law enforcement authorities for the purpose of further prosecution.
Amendment 262 #
Proposal for a regulation
Article 5 – paragraph 4 a (new)
4 a. Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when they do not comply with the due diligence obligations set out in this Regulation.
Amendment 267 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that voluntary investigations or activities are accompanied by appropriate safeguards, such as human oversight, to ensure they are transparent, fair and non-discriminatory.
Amendment 268 #
Proposal for a regulation
Article 7 – title
No general monitoring or active fact-finding obligations without undermining the obligation to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk
Amendment 284 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5 a. Providers of intermediary services that qualify as small or micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to establish collective representation under the guidance of the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative.
Amendment 287 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including the nature, purpose, modalities and data sets used in algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 288 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on the activities undertaken by them in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 290 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, necessary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 291 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, necessary and proportionate manner in applying and enforcing the activities referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 303 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. Internal complaint-handling systems (appeals systems) shall be mandatory and shall be detailed in the transparency reports.
Amendment 305 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 a (new)
Intermediaries should explain mechanisms, processes and tools that alert them to potential breaches of their rules. Provisions concerning the ease of reporting for vulnerable or atypical social groups (e.g. children, the elderly, persons with disabilities) should be detailed;
Amendment 306 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 b (new)
Intermediaries should explain mechanisms, processes and tools that drive decision-making regarding actions related to content (e.g. take-downs, suspensions). These should include provisions for the well-being of moderators when human moderation is deployed;
Amendment 307 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 c (new)
Intermediaries should explain mechanisms, processes and tools that enable the promotion or suppression of content;
Amendment 308 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 d (new)
Intermediaries should explain mechanisms, processes and tools by means of which flaggers and content creators are notified about the evolution and outcome of the company’s decisions related to reported content;
Amendment 309 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 e (new)
Intermediaries should explain mechanisms, processes, and tools by means of which decisions can be challenged;
Amendment 310 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 f (new)
Intermediaries should explain what human and other resources are applied to moderating content;
Amendment 311 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 g (new)
Intermediaries should explain how content management processes and policies are reviewed, scrutinised and revised;
Amendment 312 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 h (new)
Intermediaries should disclose information regarding provisions and resources in place to safeguard users' human rights, including their rights to privacy, safety, accessibility and information. Children's rights, as described in the UN Convention on the Rights of the Child and the Convention's General Comment No. 25, should in particular be embedded in the product and service design processes;
Amendment 313 #
Proposal for a regulation
Article 13 – paragraph 1 – subparagraph 1 i (new)
Intermediaries should report on their plans and resources available to tackle emerging harms.
Amendment 317 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 321 #
Proposal for a regulation
Article 14 – paragraph 1 a (new)
1 a. Providers of intermediary services shall act expeditiously, including and especially for time-sensitive content, upon receipt of a notice flagging illegal content or otherwise gaining knowledge or awareness of illegal activity or illegal content, by taking the content down and also preventing content that has been taken down from reappearing ('stay-down').
Amendment 323 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify and assess the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 324 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) where necessary, an explanation of the reasons why the individual or entity considers the information in question to be illegal content;
Amendment 330 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, such as the URL or URLs, or, where necessary, additional information enabling the identification of the illegal content;
Amendment 332 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
(d) a statement confirming that, to the best knowledge of the individual or entity submitting the notice, the information and allegations contained therein are accurate and complete.
Amendment 343 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Providers of hosting services using automated means for content moderation or decision-making should at least inform affected individuals about the procedure followed, the technology used and the criteria and reasoning supporting the decision, without prejudice to the duty to inform and the rights of data subjects under Regulation (EU) 2016/679.
Amendment 345 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Providers of hosting services shall ensure that content previously identified as illegal following the mechanisms in paragraphs 1 and 2 remains inaccessible after take down.
Amendment 345 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, and the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 349 #
Proposal for a regulation
Article 15 – paragraph 1 a (new)
1 a. Providers of hosting services shall, by default, not make the recipients of their services subject to advertisement that is based on the processing of personal data as defined in Regulation (EU) 2016/679 to determine the recipient or the recipients to whom the advertisement is displayed.
Amendment 350 #
Proposal for a regulation
Article 15 – paragraph 1 b (new)
1 b. Providers of hosting services may give the recipients of their services the option to receive advertisements that are based on the processing of their personal data. For this purpose, only personal data that data subjects have directly and actively provided to the hosting service provider, and for the specific purpose of receiving personalised advertisements, may be processed, provided the conditions for consent laid down in Regulation (EU) 2016/679, in particular Article 4(11) and Article 7, have been met.
Amendment 357 #
Proposal for a regulation
Article 15 – paragraph 2 – point f
(f) information on the rights of the content provider and redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
Amendment 375 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. All providers of hosting services shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the provider of hosting services on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 379 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to the information;
Amendment 380 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 381 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 382 #
Proposal for a regulation
Article 17 – paragraph 2
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints and include human review.
Amendment 386 #
Proposal for a regulation
Article 17 – paragraph 3 a (new)
Article 17 – paragraph 3 a (new)
3 a. The complaint mechanism to be established is without prejudice to the rights and remedies available to data subjects in accordance with Regulation (EU) 2016/679 and Directive 2002/58/EC.
Amendment 394 #
Proposal for a regulation
Article 18 – paragraph 2 – point d
Article 18 – paragraph 2 – point d
(d) it is capable of settling disputes in a swift, efficient, transparent and cost-effective manner and in at least one official language of the Union;
Amendment 397 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
Article 18 – paragraph 6 a (new)
6 a. Any attempt to reach an out-of- court agreement on the settlement of a dispute in accordance with this Article shall not affect the rights of the providers of online platform services and of the recipients of the service concerned to initiate judicial proceedings at any time before, during or after the out-of-court dispute settlement process.
Amendment 398 #
Proposal for a regulation
Article 18 – paragraph 6 b (new)
Article 18 – paragraph 6 b (new)
6 b. A ‘European Online Content Dispute Settlement Fund’ should be established, to be managed by the EU. The fund shall be independent and its financial resources may come from administrative fines imposed under the DSA and contributions from the EU, Member States and other stakeholders.
Amendment 401 #
Proposal for a regulation
Article 19 – paragraph 1 a (new)
Article 19 – paragraph 1 a (new)
1 a. In certain cases, such as those handled through existing internal systems or involving urgency, the trusted flagger regime should exceptionally allow other notices to be prioritised, in order to increase efficiency and the involvement of all actors.
Amendment 403 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
Article 19 – paragraph 2 – point a
(a) it has demonstrated particular competence, accuracy and expertise for the purposes of detecting, identifying and notifying illegal content;
Amendment 407 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Article 19 – paragraph 2 – point b
(b) it represents collective interests, ensures independent public interest representation and is independent from any online platform;
Amendment 408 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner and is duly accredited.
Amendment 409 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in an objective manner.
Amendment 410 #
Proposal for a regulation
Article 19 – paragraph 3
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. Digital Services Coordinators shall engage in dialogue with platforms and rights holders for maintaining the accuracy and efficacy of a trusted flagger system.
Amendment 412 #
Proposal for a regulation
Article 12 a (new)
Article 12 a (new)
Article 12 a
Child impact assessment
1. All providers must assess whether their services are accessed by, likely to be accessed by or impact on children, defined as persons under the age of 18. Providers of services likely to be accessed by or impact on children shall identify, analyse and assess, during the design and development of new services and at least once a year thereafter, any systemic risks stemming from the functioning and use made of their services in the Union by children. These risk impact assessments shall be specific to their services, meet the highest European or International standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments should also include the following systemic risks:
a. the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services;
b. any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
c. any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection or rights of children;
2. When conducting child impact assessments, providers of intermediary services likely to impact children shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child.
Amendment 414 #
Proposal for a regulation
Article 12 b (new)
Article 12 b (new)
Article 12 b
Mitigation of risks to children
Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable:
a. implementing mitigation measures identified in Article 27 with regard for children’s best interests;
b. adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
c. implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34;
d. adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child;
e. ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
f. preventing profiling, including for commercial purposes like targeted advertising;
g. ensuring published terms are age appropriate and uphold children’s rights;
h. providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
Amendment 416 #
Proposal for a regulation
Article 19 – paragraph 7
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6.
Amendment 418 #
Proposal for a regulation
Article 20 – paragraph 1
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide illegal content. A termination of the service may be issued where the recipients fail to comply with the applicable provisions set out in this Regulation or where the suspension has occurred at least three times following verification of the repeated provision of illegal content.
Amendment 422 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the age of complainants (if minors), the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 426 #
Proposal for a regulation
Article 20 – paragraph 2
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints- handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 427 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
Article 13 – paragraph 2 a (new)
2 a. Providers of intermediary services that impact on children shall publish, at least once a year:
a. child impact assessments to identify known harms, unintended consequences and emerging risks, pursuant to Article 13 (Art. 12 a new). The child impact assessments must comply with the standards outlined in Article 34;
b. clear, easily comprehensible and detailed reports outlining the child risk mitigation measures undertaken pursuant to Article 14, their efficacy and any outstanding actions required. These reports must comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design. The content of these reports must be verifiable by independent audit; data sets and source code must be made available at the request of the regulator.
Amendment 429 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 432 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
Article 20 – paragraph 3 – point d
Amendment 443 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, be it business-to-consumer or peer-to-peer, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
Amendment 451 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
Article 22 – paragraph 1 – point f
Amendment 479 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 485 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Article 24 – paragraph 1 – introductory part
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and at all times:
Amendment 487 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
Article 24 – paragraph 1 – point a
(a) that the information displayed or parts thereof is an online advertisement;
Amendment 489 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 491 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
Article 24 – paragraph 1 – point c
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed, with the same level of detail as specified by the advertiser, including the optimisation objectives for the delivery of the advertisement as specified by the advertiser or applied by the online platform.
Amendment 494 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
Article 24 – paragraph 1 – point c
(c) clear and meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
Amendment 496 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
Article 24 – paragraph 1 – point c a (new)
(c a) whether the advertisement was selected using an automated system and, in that case, the identity of the natural or legal person responsible for the system.
Amendment 499 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed. They shall also inform public authorities, non-governmental organisations and researchers, upon their request.
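Taken together, the Article 24 amendments above describe a per-impression disclosure. A hypothetical record combining those fields might look like the sketch below; the structure and field names are assumptions for illustration only, not wording from the Regulation.
```python
# Illustrative per-advertisement disclosure record combining the
# Article 24 points above. Field names are assumptions, not the Regulation's.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdDisclosure:
    is_advertisement: bool            # point (a): the content (or part) is an ad
    on_behalf_of: str                 # point (b): who the ad is displayed for
    financed_by: str                  # point (b): who financed the ad
    targeting_parameters: dict        # point (c): main parameters, advertiser-level detail
    optimisation_objectives: list     # point (c): delivery objectives
    automated_selection: bool         # point (c a): selected by an automated system?
    system_operator: Optional[str]    # point (c a): who is responsible for that system

disclosure = AdDisclosure(
    is_advertisement=True,
    on_behalf_of="Example Brand Ltd",
    financed_by="Example Brand Ltd",
    targeting_parameters={"age_range": "25-34", "interest": "cycling"},
    optimisation_objectives=["clicks"],
    automated_selection=True,
    system_operator="platform ad-delivery system",
)
print(disclosure)
```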
Amendment 501 #
Behavioural and micro-targeted advertising directed at children under the age of 18 should not be permitted.
Amendment 502 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Article 24 – paragraph 1 b (new)
Online platforms shall favour advertising that does not require any tracking of user interaction with content.
Amendment 503 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Article 24 – paragraph 1 c (new)
Online platforms shall offer the possibility to easily opt out of micro-targeted tracking.
Amendment 504 #
Proposal for a regulation
Article 24 – paragraph 1 d (new)
Article 24 – paragraph 1 d (new)
Online platforms shall offer the possibility to opt in to the use of behavioural data and to political advertising.
Amendment 507 #
Proposal for a regulation
Article 25 – paragraph 1
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, or with a turnover of over EUR 50 million per year.1a _________________ 1a Commission Staff Working Document. Impact Assessment Report. Annexes. (SWD(2020)348).
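The amendment makes Section 4 applicable when either of two alternative thresholds is met. A trivial sketch of that disjunctive test, with all figures taken from the amendment text:
```python
# Illustrative test of the two alternative thresholds in the amendment:
# >= 45 million average monthly active EU recipients, OR
# > EUR 50 million annual turnover.
def section_applies(avg_monthly_active_recipients_eu: int,
                    annual_turnover_eur: float) -> bool:
    return (avg_monthly_active_recipients_eu >= 45_000_000
            or annual_turnover_eur > 50_000_000)

assert section_applies(50_000_000, 0)        # user threshold met
assert section_applies(1_000, 60_000_000)    # turnover threshold met
assert not section_applies(1_000, 1_000)     # neither threshold met
```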
Amendment 514 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
Article 25 – paragraph 4 a (new)
4 a. Very large platforms shall allow business users and providers of ancillary services access to and interoperability with the same operating system, hardware or software features that are available or used in the provision by the gatekeeper of any ancillary services.
Amendment 515 #
Proposal for a regulation
Article 25 – paragraph 4 b (new)
Article 25 – paragraph 4 b (new)
4 b. Gatekeepers of very large platforms shall allow the installation and effective use of third party software applications or software application stores using, or interoperating with, operating systems of that gatekeeper and allow these software applications or software application stores to be accessed by means other than the core platform services of that gatekeeper. The gatekeeper shall not be prevented from taking proportionate measures to ensure that third party software applications or software application stores do not endanger the integrity of the hardware or operating system provided by the gatekeeper.
Amendment 516 #
Proposal for a regulation
Article 25 – paragraph 4 c (new)
Article 25 – paragraph 4 c (new)
4 c. Very large platforms shall refrain from technically restricting the ability of end users to switch between and subscribe to different software applications and services to be accessed using the operating system of the gatekeeper, including as regards the choice of Internet access provider for end users.
Amendment 517 #
Proposal for a regulation
Article 25 – paragraph 4 d (new)
Article 25 – paragraph 4 d (new)
4 d. Very large platforms shall allow consumers and developers in mobile application ecosystems to increase the number of applications available and ensure new functionalities across software applications and services to be accessed using the operating systems of the gatekeeper.
Amendment 526 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services and activities, including business model-driven practices and technology design decisions, in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 528 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and activities and shall include the following systemic risks:
Amendment 533 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, including the right to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the rights of the child and consumer protection, as enshrined in Articles 7, 11, 21, 24 and 38 of the Charter respectively;
Amendment 535 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, including the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 542 #
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
Article 26 – paragraph 1 – point c a (new)
(c a) the dissemination of disinformation through their services.
Amendment 544 #
Proposal for a regulation
Article 26 – paragraph 2
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content, the potential infringement of consumer rights and of information that is incompatible with their terms and conditions.
Amendment 551 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures to prevent and mitigate specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 552 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 554 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, the features or functioning of their services and activities, or their terms and conditions;
Amendment 564 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified.
Amendment 568 #
Proposal for a regulation
Article 27 – paragraph 3
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
Amendment 569 #
Proposal for a regulation
Article 27 – paragraph 3 a (new)
Article 27 – paragraph 3 a (new)
3 a. The option of a phase-out leading to a prohibition of targeted advertising on the basis of pervasive tracking shall also be considered.
Amendment 572 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
Article 28 – paragraph 1 – introductory part
1. Very large online platforms and smaller-size online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 585 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
Article 28 – paragraph 2 – point c a (new)
(c a) have proven expertise and track record in evaluating processes of content moderation.
Amendment 586 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish an audit report for each audit. Auditors shall be trusted organisations within the new regulatory system. They shall have sufficient policy expertise and experience in conducting audits and developing accurate and reliable reports that address national and regional requirements. Auditors shall be independent of industry and government. The report shall be in writing and include at least the following:
Amendment 593 #
Proposal for a regulation
Article 29 – paragraph -1 (new)
Article 29 – paragraph -1 (new)
-1. Online platforms that use recommender systems shall indicate visibly to their recipients that the platform uses recommender systems.
Amendment 594 #
Proposal for a regulation
Article 29 – paragraph -1 a (new)
Article 29 – paragraph -1 a (new)
-1 a. Online platforms shall ensure that the option activated by default for the recipient of the service is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679.
Amendment 598 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
1. Online platforms that use recommender systems shall set out separately the information concerning the role and functioning of recommender systems, in a manner that is clear for average users, concise, accessible and easily comprehensible, together with the main parameters used in their recommender systems, and shall offer controls with the available options for the recipients of the service, in a user-friendly manner, to modify, customize or influence those main parameters, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679, and basic natural criteria such as time, topics of interest, etc.
Amendment 599 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main technical parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
Amendment 601 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
Article 29 – paragraph 1 a (new)
1 a. The parameters referred to in paragraph 1 shall include, at a minimum:
(a) whether the recommender system is an automated system and, in that case, the identity of the natural or legal person responsible for the recommender system, if different from the platform provider;
(b) clear information about the criteria used by recommender systems;
(c) the relevance and weight of each criterion which leads to the information recommended;
(d) what goals the relevant system has been optimised for;
(e) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
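To make the disclosure and control duties of the Article 29 amendments above concrete, here is a minimal sketch of a recommender configuration exposing the listed parameters, including one option not based on profiling (a simple reverse-chronological feed) as the default. Everything here, names and weights alike, is an illustrative assumption, not a prescription from the text.
```python
# Illustrative sketch: user-facing recommender options, including one
# option not based on profiling (reverse-chronological ordering).
# Names, weights and structure are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class RecommenderOption:
    name: str
    automated: bool           # point (a)
    criteria: dict            # point (b)/(c): criterion -> relative weight
    optimised_for: str        # point (d)
    uses_behaviour: bool      # point (e)
    based_on_profiling: bool  # in the sense of GDPR Art. 4(4)

OPTIONS = [
    RecommenderOption(
        name="personalised",
        automated=True,
        criteria={"predicted_interest": 0.7, "recency": 0.3},
        optimised_for="engagement",
        uses_behaviour=True,
        based_on_profiling=True,
    ),
    RecommenderOption(
        name="chronological",            # the non-profiling option
        automated=True,
        criteria={"recency": 1.0},
        optimised_for="freshness",
        uses_behaviour=False,
        based_on_profiling=False,
    ),
]

# Per the amendments above, the default option must not be profiling-based.
default = next(o for o in OPTIONS if not o.based_on_profiling)
print(f"default feed: {default.name}")
```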
Amendment 604 #
Proposal for a regulation
Article 29 – paragraph 2
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible and user-friendly functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
Amendment 608 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
Article 29 – paragraph 2 a (new)
2 a. Very large online platforms shall offer users the choice of recommender systems from third party providers where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the very large online platform of its own recommender systems.
Amendment 612 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
Article 29 – paragraph 2 b (new)
2 b. Very large online platforms may only limit access to third party recommender systems temporarily in cases of demonstrable abuse by the third party provider or when justified by an immediate requirement to address technical problems such as a serious security vulnerability.
Amendment 614 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
Article 29 – paragraph 2 c (new)
Additional transparency for online advertisements and deep fakes
Amendment 618 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
Article 30 – paragraph 2 – point c a (new)
(c a) data regarding the amount of spending;
Amendment 619 #
Proposal for a regulation
Article 30 – paragraph 2 – point d a (new)
Article 30 – paragraph 2 – point d a (new)
(d a) whether one or more particular groups of recipients of the service have been explicitly excluded from the advertisement target group;
Amendment 621 #
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
Article 30 – paragraph 2 – point e a (new)
(e a) whether one or more particular groups of recipients of the service were excluded from the advertisement target group.
Amendment 626 #
Proposal for a regulation
Article 31 – paragraph 1
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation and to data that verify the effectiveness of the risk mitigation measures. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
Amendment 627 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide information and access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of facilitating and conducting public interest research that contributes to the identification and understanding of systemic risks as set out in Article 26(1), and to enable verification of the effectiveness and proportionality of the mitigation measures as set out in Article 27(1).
Amendment 629 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted public interest researchers, civil society representatives and journalists who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
Amendment 631 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
Article 31 – paragraph 3 a (new)
3 a. Very large online platforms shall provide effective portability of data generated through the activity of a business user or end user and shall, in particular, provide tools for end users to facilitate the exercise of data portability, in line with Regulation (EU) 2016/679, including by the provision of continuous and real-time access;
Amendment 632 #
Proposal for a regulation
Article 31 – paragraph 3 b (new)
Article 31 – paragraph 3 b (new)
3 b. Very large online platforms shall provide business users, or third parties authorised by a business user, free of charge, with effective, high-quality, continuous and real-time access and use of aggregated or non-personal aggregated data, that is provided for or generated in the context of the use of the relevant core platform services by those business users and the end users engaging with the products or services provided by those business users; for personal data, provide access and use, in full compliance with GDPR, only where directly connected with the use effectuated by the end user in respect of the products or services offered by the relevant business user through the relevant core platform service, and when the end user opts in to such sharing with a consent in the sense of Regulation (EU) 2016/679; the functionalities for giving information and offering the opportunity to grant consent shall be as user-friendly as possible.
Amendment 633 #
Proposal for a regulation
Article 31 – paragraph 3 c (new)
Article 31 – paragraph 3 c (new)
3 c. The data provided to vetted researchers shall be as disaggregated as possible, unless the researcher requests otherwise.
Amendment 634 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, civil society organisations or think tanks representing the public interest, be independent from commercial interests, disclose the funding financing the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 637 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the legal and technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 647 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
Article 34 – paragraph 1 – introductory part
1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies on the basis of a multi-stakeholder, transparent and inclusive process at least for the following:
Amendment 648 #
Proposal for a regulation
Article 34 – paragraph 2
Article 34 – paragraph 2
2. The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The Commission shall support the drawing up at European level of technical standards on interoperability.
Amendment 655 #
Proposal for a regulation
Article 35 – paragraph 3 a (new)
Article 35 – paragraph 3 a (new)
3 a. The codes of conduct should contain a 'fairness-by-design' duty. This duty shall ensure that digital platforms design choice architecture in a way that encourages free and informed decision-making by consumers, with a requirement to trial and test alternative approaches. The duty shall require platforms to ensure that information and options are clear and easy to find, that information and options are presented in a fair way enabling users to form their own opinions, and that users are enabled to make the choices they want to make and have those choices respected, including their ability to change their decisions.
Amendment 658 #
Proposal for a regulation
Article 36 – paragraph 3
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The codes should contain clear and precise consumer protection and human rights objectives, provide for effective and dissuasive sanctions, and be governed in a transparent manner. The effectiveness of the codes of conduct should be regularly assessed and possible legal options should be examined.
Amendment 665 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 2
Article 38 – paragraph 2 – subparagraph 2
For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator, including sharing information on cross-border cases and providing support for each other during ongoing interventions and investigations.
Amendment 666 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
Article 2 – paragraph 1 – point d a (new)
(da) ‘child’ means any natural person under the age of 18;
Amendment 671 #
Proposal for a regulation
Article 39 – paragraph 1
Article 39 – paragraph 1
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, independent, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have all necessary technical, financial and human resources (including skill and competence building) and infrastructure to carry out their tasks.
Amendment 675 #
Proposal for a regulation
Article 41 – paragraph 2 – point e a (new)
Article 41 – paragraph 2 – point e a (new)
(e a) the power to order the prohibition on the deployment of open content recommendation systems at least until compliance is guaranteed and the fundamental rights of users are sufficiently protected.
Amendment 677 #
Proposal for a regulation
Article 41 – paragraph 6
Article 41 – paragraph 6
6. Member States shall ensure that any exercise of the powers pursuant to paragraphs 1, 2 and 3 is subject to adequate safeguards laid down in the applicable national law in conformity with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties. The degree of enforcement should be commensurate with the degree of market power that the provider of intermediary services has.
Amendment 681 #
Proposal for a regulation
Article 42 – paragraph 3
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or global turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1% of the annual income or global turnover of the provider concerned.
Amendment 683 #
Proposal for a regulation
Article 42 – paragraph 4
Article 42 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the periodic penalty payment. The methodology shall specify, in particular, the criteria and parameters to calculate the penalties imposed at the national level.
Amendment 684 #
Proposal for a regulation
Article 42 – paragraph 4
Article 42 – paragraph 4
4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily global turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
Amendment 689 #
Proposal for a regulation
Article 47 – paragraph 1
Article 47 – paragraph 1
1. An independent advisory and regulatory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established.
Amendment 690 #
Proposal for a regulation
Article 47 – paragraph 2 – point c a (new)
Article 47 – paragraph 2 – point c a (new)
(c a) facilitating the creation of a joint decision-making process to reach mutual agreement among independent national regulators on further actions.
Amendment 691 #
Proposal for a regulation
Article 47 – paragraph 2 – point c b (new)
Article 47 – paragraph 2 – point c b (new)
(c b) monitoring the compliance of online platforms with the requirements for meaningful transparency.
Amendment 692 #
Proposal for a regulation
Article 47 – paragraph 2 – point c c (new)
Article 47 – paragraph 2 – point c c (new)
(c c) conducting human rights impact assessments to ensure platforms' compliance with transparency safeguards established by the DSA legislative framework;
Amendment 693 #
Proposal for a regulation
Article 47 – paragraph 2 – point c d (new)
Article 47 – paragraph 2 – point c d (new)
(c d) performing fundamental rights auditing of platforms’ content recommendation systems, advertising and microtargeting, and content moderation;
Amendment 694 #
Proposal for a regulation
Article 47 – paragraph 2 – point c e (new)
Article 47 – paragraph 2 – point c e (new)
(c e) enabling and supervising the data access framework dedicated to public interest research, and facilitating the exercise of the right of an individual (or a smaller online platform) to directly challenge a decision of a very large online platform to remove content.
Amendment 696 #
Proposal for a regulation
Article 48 – paragraph 3
Article 48 – paragraph 3
3. The Board shall be chaired by an independent authority and assisted by a secretariat. The authority shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure.
Amendment 698 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
Article 49 – paragraph 1 – point e a (new)
(e a) issue own-initiative opinions;
Amendment 700 #
Proposal for a regulation
Article 49 – paragraph 1 – point e b (new)
Article 49 – paragraph 1 – point e b (new)
(e b) issue opinions on matters other than measures taken by the Commission.
Amendment 704 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total global turnover in the preceding financial year where it finds that the platform, intentionally or negligently:
Amendment 705 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
Article 59 – paragraph 2 – introductory part
2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total global turnover in the preceding financial year, where they intentionally or negligently:
Amendment 706 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
Article 60 – paragraph 1 – introductory part
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily global turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
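For a sense of scale, the caps in Articles 59 and 60 as amended compute straightforwardly from global turnover. A hypothetical worked example follows; the turnover figure and number of days are invented for illustration.
```python
# Worked example of the fine caps as amended (all figures hypothetical).
annual_global_turnover = 10_000_000_000       # EUR 10 bn, preceding financial year

max_fine_art59_1 = 0.06 * annual_global_turnover   # 6% cap  -> EUR 600 m
max_fine_art59_2 = 0.01 * annual_global_turnover   # 1% cap  -> EUR 100 m

avg_daily_turnover = annual_global_turnover / 365
max_periodic_per_day = 0.05 * avg_daily_turnover   # 5% of average daily turnover
days_of_non_compliance = 30
max_periodic_total = max_periodic_per_day * days_of_non_compliance

print(f"Art. 59(1) cap: EUR {max_fine_art59_1:,.0f}")
print(f"Art. 59(2) cap: EUR {max_fine_art59_2:,.0f}")
print(f"Art. 60 cap:    EUR {max_periodic_per_day:,.0f}/day, "
      f"EUR {max_periodic_total:,.0f} over {days_of_non_compliance} days")
```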
Amendment 708 #
2. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 70. Before the adoption of any measures pursuant to paragraph 1, the Commission shall publish a draft thereof and invite all interested parties to submit their comments within the time period set out therein, which shall not be less than one month.
Amendment 709 #
Proposal for a regulation
Article 67 – paragraph 3
Article 67 – paragraph 3
3. The Commission shall adopt implementing acts laying down the practical and operational arrangements for the functioning of the information sharing system and its interoperability with other relevant systems. Those implementing acts shall be adopted in accordance with the examination procedure referred to in Article 70.
Amendment 712 #
Proposal for a regulation
Article 69 – paragraph 2
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation]. It is of particular importance that the Commission carry out appropriate consultations during its preparatory work, including at expert level. The Commission, when preparing and drawing up delegated acts, shall ensure a simultaneous, timely and appropriate transmission of relevant documents to the European Parliament and to the Council.
Amendment 713 #
Proposal for a regulation
Article 73 – paragraph 1
Article 73 – paragraph 1
1. By five years after the entry into force of this Regulation at the latest, and every two years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee.
Amendment 715 #
3. In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, the Board and other relevant bodies or sources. The evaluation procedure shall be based on a broad and inclusive consultation process including the voice of users and consumers. The Commission shall evaluate this Regulation together with the implementing and delegated acts referred to in this Regulation, and shall submit the results of the evaluation to the European Parliament and the Council no later than…
Amendment 746 #
Proposal for a regulation
Article 2 a (new)
Article 2 a (new)
Amendment 772 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
Article 34 – paragraph 1 a (new)
1 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory, at least for the following:
a. age assurance and age verification pursuant to Articles 12 a (new) and 12 b (new) and 13;
b. child impact assessments pursuant to Articles 12 a (new) and 13;
c. age-appropriate terms and conditions pursuant to Article 12;
d. child-centred design pursuant to Articles 12 b (new) and 13.
Amendment 937 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate and meet the highest European or International standards, pursuant to Article 34.
Amendment 968 #
Proposal for a regulation
Article 12 a (new)
Article 12 a (new)
Article 12a
Child impact assessment
1. All providers must assess whether their services are accessed by, likely to be accessed by or impact on children. Providers of services likely to be accessed by or impact on children shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year thereafter, any systemic risks stemming from the functioning and use made of their services in the Union by children. These risk impact assessments shall be specific to their services, meet the highest European or International standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments should also include the following systemic risks:
(a) the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services;
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
(c) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection or rights of children;
2. When conducting child impact assessments, providers of intermediary services likely to impact children shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child.
Amendment 973 #
Proposal for a regulation
Article 12 b (new)
Article 12 b (new)
Article 12b
Mitigation of risks to children
Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable:
(a) implementing mitigation measures identified in Article 27 with regard for children’s best interests;
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
(c) implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34;
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child;
(e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
(f) preventing profiling, including for commercial purposes like targeted advertising;
(g) ensuring published terms are age appropriate and uphold children’s rights;
(h) providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
Amendment 992 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the age of complainants (if children), the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 997 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
Article 13 – paragraph 1 a (new)
1a. Providers of intermediary services that impact on children shall publish, at least once a year:
(a) child impact assessments to identify known harms, unintended consequences and emerging risks. The child impact assessments must comply with the standards outlined in Article 34;
(b) clear, easily comprehensible and detailed reports outlining the child risk mitigation measures undertaken, their efficacy and any outstanding actions required. These reports must comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design.
Amendment 1178 #
Proposal for a regulation
Article 17 – paragraph 2
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
Amendment 1509 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
2. The profiling of children for commercial purposes, including targeted or personalised advertising, is prohibited in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.