Activities of Nicola BEER related to 2020/0361(COD)
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (77)
Amendment 87 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 90 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a large or potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admission to a user group, such information should only be considered to be publicly available when users seeking to access such information are automatically registered or admitted without human intervention to decide or select the users to whom access is granted. The mere possibility to create groups of users of a given service, including a messaging service, should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a limited number of pre-determined persons, taking into account the potential for groups to become tools for wide dissemination of content to the public. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation if they do not meet the above criteria for "dissemination to the public". Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. File-sharing services and other cloud services fall within the scope of this Regulation, to the extent that such services are used to make the stored information available to the public at the direct request of the content provider. _________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 122 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law, national law, or international law and the interests of international comity.
Amendment 123 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information under either Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
Amendment 145 #
Proposal for a regulation
Recital 7
(7) In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide and direct services at recipients which are physical persons residing in the Union or physical persons acting on behalf of a legal person in the Union, as evidenced by a substantial connection to the Union.
Amendment 158 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the provider of the online marketplace, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
Amendment 160 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of online marketplaces covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online marketplaces covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Providers of online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. The online interface shall allow traders to provide the information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
Amendment 160 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under Union or national law as a result of its display on an intermediary service is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or due to its direct connection to or its promotion of an illegal activity, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 163 #
Proposal for a regulation
Recital 50 a (new)
(50 a) Providers of online marketplaces should demonstrate their best efforts in preventing the dissemination by traders of illegal products and services. In compliance with the prohibition of general monitoring, providers shall inform recipients when the service or product they have acquired through their services is illegal. Once notified of an illegal product as foreseen in Article 14, providers of online marketplaces should take measures to prevent such notified products and services from being reuploaded on their marketplace.
Amendment 166 #
Proposal for a regulation
Recital 12 a (new)
(12a) Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including the content which represents an expression of polemic or controversial views in the course of public debate, should not be considered as illegal content. Similarly, material, such as an eye-witness video of a potential crime, should not be considered as illegal merely because it depicts an illegal act. An assessment shall determine the true purpose of that dissemination and whether material is disseminated to the public for those purposes.
Amendment 188 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Nothing in this Regulation should prevent providers from enacting end-to-end encryption of their services.
Amendment 189 #
Proposal for a regulation
Recital 63 a (new)
(63 a) By associating advertisements with content uploaded by users, very large online platforms could indirectly lead to the promotion of illegal content, or content that is in breach of their terms and conditions, and could risk considerably damaging the brand image of the buyers of advertising space. In order to prevent such practices, the very large online platforms should ensure, including through standard contractual guarantees to the purchasers of advertising space, that the content to which they associate advertisements is legal, and compliant with their terms and conditions. Furthermore, the very large online platforms should allow advertisers to have access to the results of audits carried out independently and evaluating platforms’ commitments and tools for brand safety.
Amendment 201 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. This contact point may be the same contact point as required under other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
Amendment 204 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. Where providers of intermediary services that are established in a third country choose not to do so, they become subject to the jurisdiction of all Member States, in accordance with Article 40(3).
Amendment 236 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(i a) 'online marketplace' means an online platform that allows consumers to conclude distance contracts with other traders or consumers;
Amendment 241 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The determination of this operational threshold, therefore, should only take into account those recipients which are physical persons residing in the Union or physical persons acting on behalf of a legal person established in the Union. Automated bots, fake accounts, indirect hyperlinking, FTP or other indirect downloading of content should not be included in the determination of whether this threshold is exceeded. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
Amendment 262 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
Amendment 286 #
Proposal for a regulation
Article 12 – paragraph -1 (new)
-1. Providers of intermediary services shall ensure that their terms and conditions prevent the recipients of their services from providing information that is not compliant with Union law or the law of the Member State where the information is provided. Any additional restrictions that providers of intermediary services may impose in relation to the use of their service and the information provided by the recipients of the service shall be in full compliance with the fundamental rights of the recipients of the services as enshrined in the Charter.
Amendment 300 #
Proposal for a regulation
Recital 99
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or that individuals, provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
Amendment 301 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
Amendment 302 #
Proposal for a regulation
Recital 105 a (new)
(105a) This Regulation serves as a horizontal framework to ensure the further strengthening and deepening of the Digital Single Market and the internal market and therefore seeks to lay down rules and obligations which, unless specified, seek to be applicable to all providers without regard to individual models of operation.
Amendment 303 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market whilst ensuring the rights enshrined in the Charter of Fundamental Rights of the European Union, in particular the freedom of expression and information in an open and democratic society. In particular, it establishes:
Amendment 307 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 309 #
Proposal for a regulation
Article 1 – paragraph 3
3. This Regulation shall apply to intermediary services directed at and provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
Amendment 310 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4a. This Regulation shall respect the fundamental rights recognised by the Charter of Fundamental Rights of the European Union and the fundamental rights constituting general principles of Union law. Accordingly, this Regulation may only be interpreted and applied in accordance with those fundamental rights, including the freedom of expression and information, as well as the freedom and pluralism of the media. When exercising the powers set out in this Regulation, all public authorities involved shall aim to achieve, in situations where the relevant fundamental rights conflict, a fair balance between the rights concerned, in accordance with the principle of proportionality.
Amendment 314 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. The information provided shall be broken down per Member State in which services are offered and in the Union as a whole.
Amendment 316 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 316 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5a. The Commission shall, by [within one year of the adoption of this Regulation], publish guidelines with regard to the relationship between this Regulation and the legislative acts listed in Article 1(5). Those guidelines shall clarify any potential conflicts between the conditions and obligations laid down in those legislative acts, specify which act prevails where actions taken in line with this Regulation fulfil the obligations of another legislative act, and identify which regulatory authority is competent.
Amendment 322 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of average monthly active recipients in one or more Member States; or
Amendment 324 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or due to its connection to or promotion of an illegal activity, including the sale of products, substances, animals or plants, or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law, or which directly leads to the dissemination to the public of such illegal content. Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, shall not be considered as illegal content. An assessment shall determine the true purpose of that dissemination and whether material is disseminated to the public for those purposes.
Amendment 334 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities, either through automated or manual means, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, monetisation and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
Amendment 336 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
Amendment 350 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. This Regulation shall not prevent providers from offering end- to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability.
Amendment 363 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2a. Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision.
Amendment 365 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union may designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services.
Amendment 366 #
Proposal for a regulation
Article 11 – paragraph 1 – subparagraph 1 (new)
Where a provider of intermediary services chooses not to designate a legal representative, Article 40(3) shall apply.
Amendment 367 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, other than those which are either a very large online platform or a marketplace.
Amendment 382 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or non-governmental entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, and may include: (a) a clearly identifiable banner or single reporting button, allowing the users of those services to notify quickly and easily the providers of hosting services; (b) providing information to the users on what is considered illegal content under Union and national law; (c) providing information to the users on available national public tools to signal illegal content to the competent authorities in Member States where the service is directed.
Amendment 412 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4 a. Member States can acknowledge trusted flaggers recognised in another Member State as a trusted flagger on their own territory. Trusted flaggers can be awarded the status of European trusted flagger.
Amendment 415 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6.
Amendment 436 #
Proposal for a regulation
Article 18 – paragraph 6
6. This Article shall only take effect on providers other than very large online platforms from [24 months after the date of entry into force of this Regulation].
Amendment 439 #
Proposal for a regulation
Article 22 – title
22 Traceability of traders and online advertisers
Amendment 444 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders or sells online advertisements, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
Amendment 445 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services, the online marketplaces have obtained the following information:
Amendment 450 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and where proportionate after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
Amendment 454 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where: (a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient; (b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or (c) the items removed were related to content covered by [Directive 2011/93/EU updated reference] or [Directive (EU) 2017/541 XXX New Ref to TCO Regulation].
Amendment 458 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of exact information giving rise to a suspicion that a serious criminal offence involving an imminent threat to the life or safety of persons has taken place, is taking place or is planned to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide, upon their request, any additional relevant information available.
Amendment 459 #
Proposal for a regulation
Article 22 – paragraph 2
2. The provider of the online marketplace shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources.
Amendment 461 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. Information obtained by a law enforcement or judicial authority of a Member State in accordance with Paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
Amendment 464 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
3. Where the provider of the online marketplace obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
Amendment 467 #
Proposal for a regulation
Article 22 – paragraph 4
4. The provider of the online marketplace shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
Amendment 470 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the provider of the online marketplace shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
Amendment 471 #
Proposal for a regulation
Article 22 – paragraph 6
6. The provider of the online marketplace shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
Amendment 473 #
Proposal for a regulation
Article 22 – paragraph 7
7. The provider of the online marketplace shall design and organise its online interface in a fair and user-friendly way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
Amendment 475 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
7 a. The online interface shall allow traders to provide the information allowing for the unequivocal identification of the product or the service, and, where applicable, the information concerning the labelling, including CE marking, which are mandatory under applicable legislation on product safety and product compliance.
Amendment 477 #
Proposal for a regulation
Article 22 a (new)
Article 22 a Additional provisions for online marketplaces related to illegal offers 1. The provider of the online marketplace shall take adequate measures in order to prevent the dissemination by traders using its service of offers for a product or a service which do not comply with Union law. 2. Where the provider of the online marketplace obtains an indication, including the elements listed in points (a) and (b) of Article 14(2), according to which an item of information referred to in Article 22 is inaccurate, that online marketplace service provider shall request the trader to give evidence of the accuracy of that item of information or to correct it, without delay. 3. Before the trader's offer is made available on the online marketplace, the provider of the online marketplace shall verify, with regard to the information referred to in paragraph 8 of Article 22, whether the offer that the trader wishes to propose to consumers located in the Union is mentioned in the list, or the lists, of products or categories of products identified as non-compliant, as classified in any freely accessible official online database or online interface, and shall not authorise the trader to provide the offer if the product is on such a list. 4. Where a provider of the online marketplace becomes aware of the illegal nature of a product or service offered through its services, it shall inform those recipients of the service that had acquired such product or contracted such service. 5. The provider of the online marketplace shall demonstrate its best effort to put in place proportionate mechanisms to prevent offers for products that were previously notified as counterfeit in accordance with Article 14 from reappearing on the platform. Such mechanisms should not lead to general monitoring, in conformity with Article 7. 6. The provider of the online marketplace shall suspend without undue delay the provision of its services to traders that repeatedly provide illegal offers for a product or a service. It shall immediately notify its decision to the trader.
Amendment 482 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of Key Performance Indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
Amendment 486 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Without prejudice to other Union acts, online platforms that display user- generated content that may include sponsored information or other information equivalent to advertising, which is normally provided against remuneration, shall include in their terms and conditions an obligation for the recipients of their service to inform other recipients of when they have received remuneration or any other goods in kind for their content. A failure to inform the platform or other recipients shall be deemed as a violation of the provider’s terms and conditions.
Amendment 493 #
Proposal for a regulation
Article 25 – paragraph 3 – subparagraph 1 (new)
Such a methodology shall ensure the following in relation to active recipients: (a) Automated interactions, accounts or data scans by a non-human ("bots") are not included; (b) That the mere viewing of a service without purchase, logging in or otherwise active identification of a recipient shall not be seen as an active recipient; (c) That the number shall be based on each service individually; (d) That recipients connected on multiple devices are counted only once; (e) That indirect use of a service, via a third party or linking, shall not be counted; (f) Where an online platform is hosted by another provider of intermediary services, that the active recipients are assigned solely to the online platform closest to the recipient; (g) The average number is maintained for a period of at least six months.
Amendment 498 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers: (a) can request information on where their advertisements have been placed; (b) can request information on which broker treated their data; (c) can indicate on which specific websites their ads cannot be placed. In case of non-compliance with this provision, advertisers should have access to judicial redress.
Amendment 498 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 525 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 535 #
Proposal for a regulation
Article 30 – title
Additional transparency for online advertising and "deep fakes" audiovisual media
Amendment 540 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use, deep fakes or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 541 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Where a very large online platform becomes aware that a piece of content is a deep fake, the provider shall label the content as inauthentic in a way that is clearly visible for the recipient of the service.
Amendment 542 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms that display advertising on their online interfaces shall ensure that advertisers: (a) can request and obtain information on where their advertisements have been placed; (b) can request and obtain information on which broker treated their data;
Amendment 554 #
Proposal for a regulation
Article 36 – paragraph 3 a (new)
3a. The Commission shall encourage all the actors in the online advertising eco-system to endorse and comply with the commitments stated in the codes of conduct.
Amendment 563 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
Amendment 571 #
Proposal for a regulation
Article 42 a (new)
Article 42 a General conditions for imposing penalties 1. Before penalties are issued under Article 42, when deciding whether to impose a penalty and deciding on the amount of the penalty in each individual case, due regard shall be given to the following: (a) the nature, gravity and duration of the infringement, taking into account the nature, scope or purpose of the processing concerned as well as the number of recipients affected and the level of damage suffered by them; (b) the intentional or negligent character of the infringement; (c) any action taken by the provider to mitigate the damage of the infringement; (d) the degree of responsibility of the provider taking into account any other providers involved; (e) any relevant previous infringements by the provider; (f) the degree of cooperation with the Digital Services Coordinator(s), in order to remedy the infringement and mitigate the possible adverse effects of the infringement; (g) the manner in which the infringement became known to the Member State; (h) where measures have previously been ordered against the provider concerned with regard to the same subject-matter, compliance with those measures; (i) adherence to approved codes of conduct pursuant to Articles 35 and 36; and (j) any other aggravating or mitigating factor applicable to the circumstances of the case, such as financial benefits gained, or losses avoided, directly or indirectly, from the infringement. 2. If a provider infringes several provisions of this Regulation, the total amount of the penalty shall not exceed the amount specified in Article 42(3). 3. The exercise by a Member State of its powers under this Article and Article 42 shall be subject to appropriate procedural safeguards in accordance with Union and Member State law, including effective judicial remedy and due process.
Amendment 593 #
Proposal for a regulation
Article 51 a (new)
Article 51 a Requirements for the Commission 1. The Commission shall perform its tasks under this Regulation in an impartial, transparent and timely manner. The Commission shall ensure that its units given responsibility for this Regulation have adequate technical, financial and human resources to carry out their tasks. 2. When carrying out its tasks and exercising its powers in accordance with this Regulation, the Commission shall act with complete independence. It shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party.
Amendment 610 #
2 a. Online platforms shall ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service.
Amendment 617 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(b a) the natural or legal person who paid for the advertisement;
Amendment 622 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
Amendment 623 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2 a. The Board shall, together with trusted flaggers and vetted researchers, publish guidelines on the way ad libraries should be organised.
Amendment 624 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2 b. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, and upon request of advertisers, independent audits performed by organisations complying with the criteria set out in Article 28(2). Such audits shall be based on fair and proportionate conditions agreed between platforms and advertisers, shall be conducted with a reasonable frequency and shall entail: (a) conducting quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with platforms’ terms and conditions; (b) monitoring for and detecting fraudulent use of their services to fund illegal activities; (c) assessing the performance of their tools in terms of brand safety. The audit report shall include an opinion on the performance of platforms’ tools in terms of brand safety. Where the audit opinion is not positive, the report shall make operational recommendations to the platforms on specific measures in order to achieve compliance. The platforms shall make available to advertisers, upon request, the results of such audit.
Amendment 625 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2 b. Very large online platforms shall label inauthentic videos (‘deep fakes’) as inauthentic in a way that is clearly visible for the internet user.
Amendment 642 #
Proposal for a regulation
Article 33 – paragraph 2 b (new)
2 b. The reports shall be published in the official languages of the Member States of the Union.