93 Amendments of Tiemo WÖLKEN related to 2020/0361(COD)
Amendment 96 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the freedom to conduct a business, privacy and personal data protection, the right to non-discrimination and access to justice.
Amendment 112 #
Proposal for a regulation
Recital 10
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33, as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Council Directive 93/13/EEC36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, Directive 2013/11/EU of the European Parliament and of the Council and Directive 2006/123/EC of the European Parliament and of the Council, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council38. The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions.
_________________
30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).
31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).
33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.
34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’).
35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council.
36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.
37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules.
38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 124 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and cover illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 149 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the Charter of Fundamental Rights of the European Union, including the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 158 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 168 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content, or as an obligation to use automated content-filtering tools.
Amendment 178 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. In this context and to maintain proportionality, orders addressed to a provider that has its main establishment or legal representation in another Member State or outside the Union should be limited to the Member State issuing the order, unless the legal basis for the order is directly applicable Union law.
Amendment 188 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 190 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are fair, non-discriminatory and transparent, and are drafted in a clear and unambiguous language in line with applicable Union law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making, human review, the legal consequences to be faced by the users for knowingly storing or uploading illegal content as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available.
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to creating financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and participation at multiple levels of the advertising value chain has created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients.
Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
Amendment 201 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 212 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 221 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. Dispute resolution proceedings should be concluded within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 227 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content in a designated area of expertise, that they represent collective, non-commercial interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council43.
_________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
Amendment 246 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end. Given the significant risks that arise from targeted advertising, including the amplification of illegal or harmful content and other risks associated with the reliance on pervasive tracking and data mining, targeting of advertising based on personal data should be prohibited. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising.
In this context, it is important to highlight that consent to targeted advertising should not be considered as freely given, specific and thus valid if access to the service is made conditional on processing of personal data and profiling techniques outside of the control of the user. This Regulation is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 274 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including at least one default option that is not based on profiling of the recipient and alternative, third-party recommender systems where technically possible.
Amendment 290 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Recipients of a service are often locked in to existing platforms due to network effects, which significantly limits user choice. In order to facilitate free choice of recipients between different services, it is therefore important to consider interoperability for industry-standard features of very large online platforms, such as core messaging functionality or image-sharing services. Such interoperability would empower recipients to choose a service based on its functionality and features such as security, privacy, and data processing standards, rather than its existing user base.
Amendment 292 #
Proposal for a regulation
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
Amendment 365 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Directive 2006/123/EC
Amendment 383 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 402 #
Proposal for a regulation
Article 2 – paragraph 1 – point q
(q) ‘terms and conditions’ means all terms and conditions or specifications provided by the provider of intermediary services, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.
Amendment 405 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘manifestly illegal content’ means any information which is unmistakably, and without requiring in-depth examination, in breach of legal provisions regulating the legality of content online.
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Amendment 435 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or temporarily disabling of access to, manifestly illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
Amendment 436 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall notify the competent judicial or administrative authority and the recipient of the service concerned about detection and/or disabling of access to manifestly illegal content without undue delay; notified authorities shall authorise the permanent removal of the content notified;
Amendment 437 #
Proposal for a regulation
Article 6 – paragraph 1 b (new)
Voluntary own-initiative investigations shall not lead to ex-ante control measures based on automated content moderation tools.
Amendment 438 #
Proposal for a regulation
Article 6 – paragraph 1 c (new)
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as human oversight, documentation, traceability, transparency of algorithms used or additional measures to ensure the accuracy, fairness, transparency and non-discrimination of voluntary own-initiative investigations.
Amendment 441 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. Providers of intermediary services shall not be obliged to use automated tools for content moderation.
Amendment 452 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed, with due regard to the fundamental rights of the recipient of the service concerned;
Amendment 454 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
- identification of the competent judicial or administrative authority;
Amendment 455 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 b (new)
- reference to the legal basis for the order;
Amendment 457 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, addressed to a provider that has its main establishment in the Member State issuing the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective;
Amendment 460 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) the territorial scope of an order addressed to a provider that has its main establishment or legal representation in another Member State or outside the Union, is limited to the territory of the Member State issuing the order, unless the legal basis for the order is directly applicable Union law;
Amendment 461 #
Proposal for a regulation
Recital 52 a (new)
(52a) The market position of very large online platforms allows them to collect and combine enormous amounts of personal data, thereby strengthening their market position vis-à-vis smaller competitors, while at the same time incentivising other online platforms to take part in comparable data collection practices and thus creating an unfavourable environment for consumers. Therefore, the collecting and further processing of personal data for the purpose of displaying tailored advertisement should be prohibited. The selection of advertisements shown to a consumer should consequently be based on contextual information, such as the language settings of the user's device or the digital location. Besides a positive effect on privacy and data protection rights of users, the ban will increase competition on the market and will facilitate market access for smaller online platforms and privacy-friendly business models.
Amendment 465 #
Proposal for a regulation
Recital 52 b (new)
(52b) The ban on targeted advertising should not hinder contextual advertisement, such as the displaying of a car advertisement on a website presenting information from the automotive sector.
Amendment 496 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, with due regard to fundamental rights of the recipient of the service concerned, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
Amendment 499 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
Article 9 – paragraph 2 – point a – indent 1 a (new)
- identification of the competent judicial or administrative authority;
Amendment 501 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 b (new)
Article 9 – paragraph 2 – point a – indent 1 b (new)
- reference to the legal basis for the order;
Amendment 513 #
Proposal for a regulation
Article 9 – paragraph 4 a (new)
Article 9 – paragraph 4 a (new)
4 a. The obligations under this Article shall not oblige providers of intermediary services to introduce new tracking or profiling techniques for recipients of the service in order to comply with orders to provide information.
Amendment 534 #
Proposal for a regulation
Article 12 – paragraph 1
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format and include a searchable archive of previous versions of the provider’s terms and conditions.
Amendment 547 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting-out from optional clauses and the remedies available.
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 584 #
Proposal for a regulation
Article 13 b (new)
Article 13 b (new)
Article 13b
Online interface design
1. Providers of intermediary services shall refrain from subverting or impairing autonomous decision-making or free choice of a recipient of a service through the design, functioning or operation of online interfaces or a part thereof, such as but not limited to:
(a) according visual prominence to one option when asking the recipient of the service for consent or a decision;
(b) repeatedly requesting consent to data processing or requesting a change to a setting or configuration of the service after the recipient of the service has already made her choice;
(c) making the procedure of cancelling a service more difficult than signing up to it.
2. A choice or decision by the recipient of the service using an online interface that does not comply with the requirements of this Article shall not constitute consent in accordance with Regulation (EU) 2016/679.
3. The Commission shall be empowered to publish guidelines indicating specific design choices that qualify as subverting or impairing the autonomy, decision-making processes or choices of the recipient of the service.
Amendment 588 #
Proposal for a regulation
Article 14 – paragraph 1
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, clearly visible on the hosting service interface, and allow for the submission of notices exclusively by electronic means and in the language of the individual or entity submitting a notice.
Amendment 589 #
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 593 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
Article 14 – paragraph 2 – point a
(a) a sufficiently substantiated explanation of the reasons why the individual or entity considers the information in question to be illegal content;
Amendment 597 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 606 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 on the basis of which a diligent economic operator can identify the illegality of the content in question shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
Amendment 639 #
Proposal for a regulation
Article 15 a (new)
Article 15 a (new)
Article 15a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures for content moderation based on automated tools or ex-ante filtering of content. Where providers of hosting services use automated tools for content moderation, they shall ensure qualified human oversight for any action taken and that legal content which does not infringe the terms and conditions set out by the provider is not affected. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service. Content moderation practices shall be proportionate to the type and volume of content, relevant and limited to what is necessary for the purposes for which the content is moderated.
3. Providers of hosting services shall not subject recipients of the service to discriminatory practices, exploitation or exclusion for the purposes of content moderation, such as removal of user-generated content based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.
Amendment 642 #
Proposal for a regulation
Article 15 b (new)
Article 15 b (new)
Article 15b
Content moderation staff
Providers of hosting services shall ensure adequate qualification of staff working on content moderation, including ongoing training on the applicable legislation and fundamental rights. The provider shall also provide appropriate working conditions, including the opportunity to seek professional support, qualified psychological assistance and qualified legal advice.
Amendment 656 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions against or in favour of removal or disabling of access to the information;
Amendment 657 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
Article 17 – paragraph 1 – point b
(b) decisions against or in favour of suspension or termination of the provision of the service, in whole or in part, to the recipients;
Amendment 659 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
Article 17 – paragraph 1 – point c
(c) decisions against or in favour of suspension or termination of the recipients’ account.
Amendment 663 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
Article 17 – paragraph 1 – point c a (new)
(ca) decisions against or in favour of demonetising content provided by the recipients;
Amendment 665 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
Article 17 – paragraph 1 – point c b (new)
(cb) decisions against or in favour of applying additional labels or information to content provided by the recipients;
Amendment 668 #
Proposal for a regulation
Article 17 – paragraph 1 – point c c (new)
Article 17 – paragraph 1 – point c c (new)
(cc) decisions that adversely affect the recipient’s access to significant features of the platform’s regular services;
Amendment 669 #
Proposal for a regulation
Article 17 – paragraph 1 – point c d (new)
Article 17 – paragraph 1 – point c d (new)
(cd) decisions not to act upon a notice.
Amendment 677 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
Article 17 – paragraph 5 a (new)
5a. Online platforms shall ensure that any relevant information in relation to decisions taken by the internal complaint-handling mechanism is available to recipients of the service for the purpose of seeking redress through an out-of-court dispute settlement body pursuant to Article 18 or before a court.
Amendment 681 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law. Judicial redress against a decision by an out-of-court dispute settlement body shall be directed against the online platform, not the settlement body.
Amendment 684 #
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
Article 18 – paragraph 2 – introductory part
2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body, where the body has demonstrated that it meets all of the following conditions:
Amendment 687 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms, including aspects such as financial resources and personnel;
Amendment 696 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear, fair and publicly available rules of procedure.
Amendment 697 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1
Article 18 – paragraph 2 – subparagraph 1
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time.
Amendment 701 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 2
Article 18 – paragraph 3 – subparagraph 2
Certified out-of-court dispute settlement bodies shall make information on the fees, or the mechanisms used to determine the fees, publicly available.
Amendment 703 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
Article 18 – paragraph 6 a (new)
6a. Decisions reached by an out-of-court dispute settlement body shall not be disputable by another out-of-court dispute settlement body and the resolution of a particular dispute may only be discussed in one out-of-court dispute settlement body.
Amendment 711 #
Proposal for a regulation
Article 19 – paragraph 1
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices on manifestly illegal content submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
Amendment 717 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying manifestly illegal content in a designated area of expertise;
Amendment 722 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
Article 19 – paragraph 2 – point b
(b) it represents collective, non-commercial interests and is independent from any online platform;
Amendment 724 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner and in full respect of fundamental rights such as the freedom of expression and information.
Amendment 736 #
Proposal for a regulation
Article 19 – paragraph 5
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inadequately substantiated or incorrect notices, or notices violating recipients’ fundamental rights, through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 746 #
Proposal for a regulation
Article 2 a (new)
Article 2 a (new)
Amendment 813 #
Proposal for a regulation
Article 23 – paragraph 1 – point a
Article 23 – paragraph 1 – point a
(a) the number of disputes submitted to certified out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures;
Amendment 827 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 832 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Article 24 – paragraph 1 a (new)
2. Online platforms that display advertising on their online interfaces shall include in the reports referred to in Article 13 the following information:
(a) the number of advertisements removed, disabled, or labelled by the online platform, accompanied by a justification explaining the grounds for the decision;
(b) aggregated data on the provider of the online advertisements that were removed, disabled or labelled by the online platform, including information on the advertisement published, the amount paid for the advertisement and information on the target audience, if applicable.
Amendment 846 #
Proposal for a regulation
Article 25 – paragraph 1
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of unique monthly active recipients of the service in the Union equal to or higher than 45 million on average, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
Amendment 855 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services and activities, such as business model and design decisions, in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 866 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of fundamental rights, including the respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 902 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
Article 27 – paragraph 2 – point a
(a) identification and assessment of all systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 930 #
Proposal for a regulation
Article 29 – paragraph 1
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. Any option based on profiling within the meaning of Article 4 (4) of Regulation (EU) 2016/679 shall never be the default setting of a recommender system.
Amendment 934 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
Article 29 – paragraph 1 a (new)
Amendment 942 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
Article 29 – paragraph 2 a (new)
2a. Very large online platforms that use recommender systems shall allow the recipient of the service to have information presented to them in a chronological order only and, where technically possible, to use third-party recommender systems. Third-party recommender systems shall have access to the same information available to the recommender systems used by the platform, notwithstanding the platform’s obligations under Regulation (EU) 2016/679. Very large online platforms may only temporarily limit access to third- party recommender systems in case of provable abuse by the third-party provider or when justified by an immediate requirement to address a technical issue such as a serious security vulnerability.
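Illustrative sketch only, not part of the amendment text: the chronological-only presentation this paragraph requires is, in effect, an ordering by publication time that ignores engagement or profiling signals. The function name and item shape below are hypothetical.

```python
# Hypothetical sketch of the chronological-only feed required by the
# paragraph above: items are ordered strictly by publication time,
# newest first, with no engagement or profiling signals consulted.
from datetime import datetime

def chronological_feed(items):
    """Sort items by their 'published' timestamp, most recent first."""
    return sorted(items, key=lambda i: i["published"], reverse=True)

items = [
    {"id": 1, "published": datetime(2021, 3, 1)},
    {"id": 2, "published": datetime(2021, 3, 3)},
    {"id": 3, "published": datetime(2021, 3, 2)},
]
print([i["id"] for i in chronological_feed(items)])  # [2, 3, 1]
```

A third-party recommender system, as envisaged in the paragraph, would replace this sort with its own ranking while receiving the same `items` input the platform's own system sees.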
Amendment 948 #
Proposal for a regulation
Article 30 – paragraph 1
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces an easily accessible and searchable repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
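Illustrative sketch only, not part of the amendment text: the repository this paragraph describes stores ad-level metadata for a fixed retention window and excludes any recipient personal data. The class, field names and retention constant below are hypothetical.

```python
# Hypothetical sketch of a searchable ad repository with a roughly
# five-year retention window; only ad-level metadata is stored,
# nothing about the recipients to whom the ad was shown.
from datetime import datetime, timedelta

RETENTION = timedelta(days=5 * 365)  # approximate five-year window

class AdRepository:
    def __init__(self):
        self._ads = []

    def add(self, ad_id, content, last_displayed):
        # Stores metadata about the advertisement only.
        self._ads.append({"id": ad_id, "content": content,
                          "last_displayed": last_displayed})

    def search(self, term, now):
        """Return ids of ads matching the term whose last display
        falls within the retention window."""
        return [a["id"] for a in self._ads
                if term in a["content"]
                and now - a["last_displayed"] <= RETENTION]

repo = AdRepository()
repo.add("x1", "car insurance offer", datetime(2021, 1, 1))
print(repo.search("car", datetime(2022, 1, 1)))  # ['x1']
```

In practice the search interface would be exposed through the application programming interfaces the paragraph mandates; the in-memory list here only stands in for that public endpoint.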
Amendment 951 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
Amendment 956 #
(ea) any decisions by the online platform regarding labelling, removal or disabling of online advertisements, including a justification explaining the grounds for the decision.
Amendment 972 #
Proposal for a regulation
Article 31 – paragraph 3
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, in an easily accessible and user-friendly format. This shall include personal data only where it is lawfully accessible by the public and without prejudice to Regulation (EU) 2016/679.
Amendment 975 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions or civil society organisations representing the public interest, be independent from commercial interests, disclose the sources of funding financing their research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 977 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1006 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Amendment 1011 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
Article 34 – paragraph 2 a (new)
2a. Absence of agreement on voluntary industry standards shall not prevent the applicability or implementation of any measures outlined in this Regulation.
Amendment 1019 #
Proposal for a regulation
Article 13 b (new)
Article 13 b (new)
Amendment 1051 #
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law or the allocation of additional powers under other applicable law.
Amendment 1069 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Article 43 – paragraph 1 a (new)
Reporting persons within the meaning of Article 4 of Directive (EU) 2019/1937 shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the reporting person resides. Such complaints shall be treated with priority by the Digital Services Coordinator and shall, where appropriate, be transmitted to the Digital Services Coordinator of the establishment of the provider of the intermediary service concerned.
Amendment 1130 #
Proposal for a regulation
Article 15 a (new)
Article 15 a (new)
Article 15a
Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are provided to staff. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.