
Activities of Marisa MATIAS related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/09/28
Committee: ITRE
Dossiers: 2020/0361(COD)
Documents: PDF(418 KB) DOC(266 KB)
Authors: Henna VIRKKUNEN (MEP ID 124726)

Amendments (87)

Amendment 73 #
Proposal for a regulation
Recital 2
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice, without lock-in effects.
2021/06/23
Committee: ITRE
Amendment 76 #
Proposal for a regulation
Recital 4 a (new)
(4 a) Lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021- 2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals. Accessibility for persons with disabilities means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
2021/06/23
Committee: ITRE
Amendment 79 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. In the event of a conflict between lex specialis Directives and their implementing national measures and the present Regulation, the lex specialis provisions shall prevail. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/06/23
Committee: ITRE
Amendment 80 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights as implemented in national laws so as to ensure the highest level of protection of these rights, which establish specific rules and procedures that should remain unaffected.
2021/06/23
Committee: ITRE
Amendment 83 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also refer back to EU and national legal definitions, as well as cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/06/23
Committee: ITRE
Amendment 86 #
Proposal for a regulation
Recital 12 a (new)
(12 a) More than 80% of illegal content is removed solely on the basis of the automated mechanisms put in place by the platforms. This content, which is potentially the most toxic, is therefore never brought to the attention of law enforcement authorities and thus escapes any judicial sanction. This situation should therefore be remedied, as it sets a dangerous precedent whereby big platforms decide what constitutes illegal content.
2021/06/23
Committee: ITRE
Amendment 124 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to reinforce and guarantee different legislation and rights such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/06/23
Committee: ITRE
Amendment 146 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent public interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/06/23
Committee: ITRE
Amendment 164 #
Proposal for a regulation
Recital 50 a (new)
(50 a) Without prejudice to Article 16, to strengthen the obligations of online marketplaces, further ex-ante provisions must be put in place, so as to ensure ex ante that consumers have the necessary information for product offers, prevent unsafe and non-compliant products and product categories, strengthen ex-ante actions against product counterfeiting as well as to cooperate (ex post) where necessary with regard to dangerous products already sold.
2021/06/23
Committee: ITRE
Amendment 169 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on over-consumption and climate change as well as the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. Lastly, basing advertisements on user behaviour should be prohibited. Personalised advertisements can be based on the content the user is viewing only, and tracking the user beyond the platform itself, on the wider web, is forbidden. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising.
Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/06/23
Committee: ITRE
Amendment 177 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as on online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures, to redress in particular filtering bubbles and filtering effects.
2021/06/23
Committee: ITRE
Amendment 196 #
Proposal for a regulation
Recital 73 a (new)
(73 a) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the country-of-origin principle must be adjusted, so as to share the burden and avoid the risk that some authorities are unable to carry out their tasks. Such cooperation would allow proper account to be taken of national specificities relating to the regulation of content, while bearing in mind the respect for EU legislation, namely with regard to fundamental rights and the rule of law. Prerogatives of intervention should be granted in favour of the competent authorities of the country of destination.
2021/06/23
Committee: ITRE
Amendment 205 #
(95) In order to address those fundamental rights concerns it is therefore necessary to provide for a common system of enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the Digital Services Coordinator of establishment, upon its own initiative or upon the Board’s advice, should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. That Digital Services Coordinator should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Digital Services Coordinator should perform its role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board.
2021/06/23
Committee: ITRE
Amendment 211 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/06/23
Committee: ITRE
Amendment 215 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
(b) Directive (EU) 2019/882;
2021/06/23
Committee: ITRE
Amendment 216 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights as implemented in national laws so as to ensure the highest level of protection of these rights;
2021/06/23
Committee: ITRE
Amendment 238 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against indirect and direct forms of remuneration specifically for promoting that information;
2021/06/23
Committee: ITRE
Amendment 239 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed as well as ranking and prioritisation techniques;
2021/06/23
Committee: ITRE
Amendment 243 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘online marketplace’ means a service using software, including a website, part of a website or an application, operated by or on behalf of a trader which allows consumers to conclude distance contracts with other traders or consumers, according to Directive (EU) 2019/2161;
2021/06/23
Committee: ITRE
Amendment 259 #
Proposal for a regulation
Article 5 – paragraph 3
3. Online marketplaces and traders can be jointly liable for:
- non-compliance with their due diligence obligations,
- damages, when failing to act upon obtaining credible evidence of illegal activities, without incurring a general duty to monitor the activity of platform users,
- damages, contract performance and guarantees:
1 - for failure to inform consumers about the supplier of the goods or services, in line with Article 4.5 of the Omnibus Directive introducing the new Art. 6a.1,b) of the Consumer Rights Directive and CJEU Wathelet case C-149/15;
2 - for providing misleading information, guarantees, or statements; where the platform has a predominant influence over suppliers or the transaction.
Such predominant influence or control can be inferred by non-exhaustive and non-cumulative criteria that would be assessed on a case-by-case basis by courts, such as:
a) The supplier-customer contract is concluded exclusively through facilities provided on the platform;
b) The platform operator withholds the identity of the supplier or contact details until after the conclusion of the supplier-customer contract;
c) The platform operator exclusively uses payment systems which enable the platform operator to withhold payments made by the customer to the supplier;
d) The terms of the supplier-customer contract are essentially determined by the platform operator;
e) The price to be paid by the customer is set by the platform operator;
f) The marketing is focused on the platform operator and not on suppliers; or
g) The platform operator promises to monitor the conduct of suppliers and to enforce compliance with its standards beyond what is required by law.
2021/06/23
Committee: ITRE
Amendment 271 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Online marketplaces should conduct periodic checks on trader accounts and the products and services whose offering they facilitate.
2021/06/23
Committee: ITRE
Amendment 278 #
Proposal for a regulation
Chapter III – title
Due diligence obligations for a transparent, accessible and safe online environment
2021/06/24
Committee: ITRE
Amendment 279 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. Providers of intermediary services shall at least communicate to the Commission and the Board the contact details of their single point of contact. They shall ensure that that information is up to date.
2021/06/24
Committee: ITRE
Amendment 281 #
Article 10 a
Accessibility requirements for intermediary services
1. Providers of intermediary services which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882.
2. Providers of intermediary services shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Intermediary service providers shall keep that information for as long as the service is in operation.
3. Providers of intermediary services shall ensure that information, forms and measures provided pursuant to Articles 10 new (9), 12(1), 13(1), 14(1) and (5), 15(3) and (4), 17(1), (2) and (4), 23(2), 24, 29(1) and (2), 30(1), and 33(1) are made available in a manner that they are easy to find, accessible to persons with disabilities, and do not exceed a level of complexity superior to level B1 (intermediate) of the Council of Europe’s Common European Framework of Reference for Languages.
4. Providers of intermediary services which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services.
5. In the case of non-conformity, providers of intermediary services shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements. Furthermore, where the service is not compliant with applicable accessibility requirements, the provider of the intermediary service shall immediately inform the Digital Services Coordinator of establishment or other competent national authority of the Member State in which the service is established, to that effect, giving details, in particular, of the non-compliance and of any corrective measures taken.
6. Providers of intermediary services shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements.
7. Intermediary services which are in conformity with harmonised standards or parts thereof the references of which have been published in the Official Journal of the European Union shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements.
8. Intermediary services which are in conformity with the technical specifications or parts thereof adopted for Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
9. All intermediary services shall, at least once a year, report to Digital Services Coordinators or other competent authorities on their obligation to ensure accessibility for persons with disabilities as required by this Regulation.
10. In addition to Article 44(2), Digital Services Coordinators shall include measures taken pursuant to this Article.
2021/06/24
Committee: ITRE
Amendment 304 #
Proposal for a regulation
Article 13 – paragraph 1 – point d a (new)
(d a) the total number and frequency of user-reported breaches of platform standards on disinformation;
2021/06/24
Committee: ITRE
Amendment 318 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. Those mechanisms should never replace any decision of independent judicial and administrative authorities on whether a content is illegal or not.
2021/06/24
Committee: ITRE
Amendment 327 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs where possible, and, where necessary, additional information enabling the identification of the illegal content;
2021/06/24
Committee: ITRE
Amendment 334 #
Proposal for a regulation
Article 14 – paragraph 4
4. The provider of hosting services shall send the notification to the content provider, informing them that a complaint has been made against their content. The notification should be delivered to content providers before any action is taken by the provider of hosting services. The provider of hosting services shall either forward lawful and compliant notifications to the content provider, or notify the complainant of the reason it is not possible to do so.
2021/06/24
Committee: ITRE
Amendment 341 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and remove or disable access to the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/06/24
Committee: ITRE
Amendment 347 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides whether or not to remove or disable access to, or otherwise moderate either the form or distribution of, specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, or other content moderation and content curation measure, of the decision and provide a clear and specific statement of reasons for that decision.
2021/06/24
Committee: ITRE
Amendment 351 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access;
2021/06/24
Committee: ITRE
Amendment 354 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
(d) where the decision concerns allegedly illegal content, a reference to the legal ground relied on, explanations as to why the information is considered to be illegal content on that ground, and an indication of how long the national authority’s decision is expected to take;
2021/06/24
Committee: ITRE
Amendment 356 #
Proposal for a regulation
Article 15 – paragraph 2 – point d a (new)
(d a) Where the decision is based on an assessment of a risk under the risk assessment procedure of Article 26 of this Act, a reference to the risk identified and explanations as to why any mitigation measure applied under Article 27 of this act was considered to be required to mitigate that risk.
2021/06/24
Committee: ITRE
Amendment 368 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, with the exception of Articles 17 (internal complaint mechanism), 18 (out-of-court dispute settlement), 22 (traceability of traders), 24 (online advertising) and 29 (recommender systems).
2021/06/24
Committee: ITRE
Amendment 377 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least twelve months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/06/24
Committee: ITRE
Amendment 387 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay, which cannot exceed three weeks following the filed complaint, of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities.
2021/06/24
Committee: ITRE
Amendment 406 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents public interests, has no commercial interest in removing the content and is independent from any online platform or subcontractors;
2021/06/24
Committee: ITRE
Amendment 441 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has, in the case of third-country traders, verified that the latter has a European branch or representative, and has obtained the following information:
2021/06/24
Committee: ITRE
Amendment 456 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. Online marketplaces should conduct periodic checks on trader accounts and the products and services whose offering they facilitate, in view of respecting Article 7.
2021/06/24
Committee: ITRE
Amendment 461 #
Proposal for a regulation
Article 22 – paragraph 2 a (new)
2 a. Platforms shall conduct regular and diligent checks on traders’ legitimacy and the information they provide as soon as they receive it. The online platform shall:
a) prevent the online offer of unsafe and non-compliant products and product categories, namely by checking official online databases in compliance with the recommendation of the European Product Safety Pledge;
b) prevent the online offer of counterfeit products;
c) cooperate ex post where necessary with regard to dangerous products already sold, namely with the competent authorities, and take action to warn consumers.
2021/06/24
Committee: ITRE
Amendment 483 #
Proposal for a regulation
Article 24 – title
Online advertising transparency requirements
2021/06/24
Committee: ITRE
Amendment 484 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, meaningful and unambiguous manner and in real time:
2021/06/24
Committee: ITRE
Amendment 486 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed is an advertisement and whether the advertisement has been optimised by an algorithm, including disclosure of all proxies that were used for the optimisation of the advertisement;
2021/06/24
Committee: ITRE
Amendment 488 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and whether the advertisement was selected using an automated system and in that case, the identity of the natural or legal person(s) responsible for the system(s);
2021/06/24
Committee: ITRE
Amendment 492 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful information about all parameters used to determine and target the recipient to whom the advertisement is displayed, including the category and source of personal data uploaded to the online platform and the legal basis for uploading this personal data pursuant to Regulation (EU) 2016/679;
2021/06/24
Committee: ITRE
Amendment 495 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(c a) meaningful information about algorithms used to optimise the advertisement, if any, including a meaningful explanation of the optimisation goal, of the proxy attributes used for its optimisation, and of the reasons why the online platform optimised and displayed the advertisement to this recipient in order to achieve its optimisation goal;
2021/06/24
Committee: ITRE
Amendment 497 #
Proposal for a regulation
Article 24 – paragraph 1 – point c b (new)
(c b) Recipients of the service shall have access to profiling data that online platforms hold about them. This data should be made available to recipients of the service in a comprehensible format and should also include inferences made about that recipient, pursuant to Regulation (EU) 2016/679. Recipients of the service shall also be able to rectify and delete their profile, including information inferred about them by the platform. Such profiles must not be used for advertising.
2021/06/24
Committee: ITRE
Amendment 505 #
Proposal for a regulation
Article 24 a (new)
Article 24 a
Restrictions on targeted advertisements
Distributing advertisements by targeting recipients of the service on the basis of their behavioural data or by using profiling techniques shall be prohibited. Personalised advertisements may be based only on the content the recipient is viewing on the online platform, with due information. Tracking the user beyond the platform itself, through other services or on the wider web, shall be prohibited. Online platforms shall not be allowed to resort to cross-device and cross-site combination of data processed inside or outside the platform.
2021/06/24
Committee: ITRE
Amendment 508 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 22,5 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/06/24
Committee: ITRE
Amendment 509 #
Proposal for a regulation
Article 25 – paragraph 2
2. The Commission shall adopt delegated acts in accordance with Article 69 to adjust the number of average monthly recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases by at least 5 % in relation to its population in 2020 or, after adjustment by means of a delegated act, in relation to its population in the year in which the latest delegated act was adopted. In that case, it shall adjust the number so that it corresponds to 5 % of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions.
2021/06/24
Committee: ITRE
Amendment 513 #
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 2
The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and shall keep that list updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from two months after that publication.
2021/06/24
Committee: ITRE
Amendment 524 #
Proposal for a regulation
Article 26 – title
Ex ante human rights impact assessment
2021/06/24
Committee: ITRE
Amendment 527 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), and at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This ex ante human rights impact assessment shall be specific to their services and shall include the following systemic risks:
2021/06/24
Committee: ITRE
Amendment 531 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively as well as any other human right and freedom enshrined in the Charter that can be negatively affected by these systems now or in the future;
2021/06/24
Committee: ITRE
Amendment 538 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) actual or foreseeable systemic negative effects on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security, including but not restricted to the risk of the intentional manipulation of their service by means of inauthentic use or automated exploitation of the service.
2021/06/24
Committee: ITRE
Amendment 545 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting ex ante mandatory human rights impact assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisements influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/06/24
Committee: ITRE
Amendment 548 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2 a. The risk assessments must be designed with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
2021/06/24
Committee: ITRE
Amendment 549 #
Proposal for a regulation
Article 26 – paragraph 2 b (new)
2 b. A summary or a redacted version of these mandatory ex ante fundamental rights impact assessments should be made publicly available and accessible. All information needs to be communicated to the Digital Services Coordinators and to national authorities with relevant expertise. The summary shall be made equally accessible to any not-for-profit body, organisation or association which has been properly constituted in accordance with the law of a Member State, has statutory objectives which are in the public interest, and is active in the field of the protection of consumer rights or of fundamental rights and freedoms with regard to the protection of personal data, freedom of expression and opinion, access to information, consumer protection, the right to equal treatment and the prohibition of discrimination, for the purposes of independent audits as referred to in Article 28 of this Regulation. Such bodies shall be vetted by the independent enforcement and monitoring unit of the Commission, and the list of vetted bodies should be administered and made publicly available by the Board.
2021/06/24
Committee: ITRE
Amendment 553 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions, including but not limited to progressively redesigning their content recommendation algorithms to reduce and mitigate the risks identified under Article 26(1)(c), to place less emphasis on the retention of attention and more emphasis on the protection of fundamental human rights, and ensuring adequate labelling and notification to users of legal but harmful content on their service;
2021/06/24
Committee: ITRE
Amendment 555 #
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(a a) adapting content moderation policies and practices so that they do not involve the monitoring or profiling of the behaviour of individuals, unless the online platform can demonstrate, on the basis of a mandatory ex ante human rights impact assessment, that such measures are strictly necessary to mitigate the categories of systemic risks identified in Article 26, and in accordance with Regulation (EU) 2016/679 and Directive 2002/58/EC;
2021/06/24
Committee: ITRE
Amendment 557 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes or supervision of any of their activities, in particular as regards conducting the mandatory ex ante human rights impact assessment, including but not limited to ensuring that the data sets that inform the detection of risk do so in all relevant languages in which the services operate, and do so in a manner that promotes diversity and inclusion and does not breach the fundamental right to freedom from discrimination;
2021/06/24
Committee: ITRE
Amendment 558 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;
2021/06/24
Committee: ITRE
Amendment 562 #
Proposal for a regulation
Article 27 – paragraph 1 – subparagraph 1 a (new)
The mitigation measures must be designed with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
2021/06/24
Committee: ITRE
Amendment 566 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Board, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those guidelines the Board, in cooperation with an independent enforcement and monitoring unit of the Commission, shall organise public consultations.
2021/06/24
Committee: ITRE
Amendment 570 #
Proposal for a regulation
Article 27 a (new)
Article 27 a
Mitigation of risks for the freedom of expression and freedom and pluralism of the media
1. Where specific systemic risks for the exercise of freedom of expression and freedom and pluralism of the media pursuant to Article 26(1)(b) emerge, very large online platforms shall ensure that the exercise of these fundamental rights is always adequately and effectively protected.
2. Where very large online platforms allow for the dissemination of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, of audiovisual media services within the meaning of Article 1(1)(a) of Directive 2010/13/EU (AVMS) or of other editorial media, which are published in compliance with applicable Union and national law under the editorial responsibility and control of a press publisher, audiovisual or other media service provider, who can be held liable under the laws of a Member State, the platforms shall be prohibited from removing, disabling access to, suspending or otherwise interfering with such content or services, or from suspending or terminating the service providers’ accounts, on the basis of the alleged incompatibility of such content with their terms and conditions.
3. Very large online platforms shall ensure that their content moderation, their decision-making processes, the features or functioning of their services, their terms and conditions and their recommender systems are objective, fair and non-discriminatory.
2021/06/24
Committee: ITRE
Amendment 574 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
2021/06/24
Committee: ITRE
Amendment 577 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III; these should include auditing the processes and procedures enforced by very large online platforms;
2021/06/24
Committee: ITRE
Amendment 580 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by the not-for-profit bodies, organisations or associations which have been properly constituted in accordance with the law of a Member State and which:
2021/06/24
Committee: ITRE
Amendment 583 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
(c) have proven objectivity and professional ethics, and a transparent record of compliance with the international human rights framework, based in particular on adherence to the EU Charter of Fundamental Rights, the UN Guiding Principles and the Shadow EU Action Plan on the Implementation of the UN Guiding Principles on Business and Human Rights within the EU.
2021/06/24
Committee: ITRE
Amendment 589 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that contains evidence of systemic risks stemming from the functioning and use made of their services in the Union shall take due account of any operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
2021/06/24
Committee: ITRE
Amendment 595 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, and they shall provide options for the recipients of the service to modify or influence those main parameters, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679. The parameters shall include at least:
(a) the criteria used by recommender systems, including the family of models, the input data, the performance metrics, and how the model was tested;
(b) logs of recommended content and the criteria used for such recommendations, including their mutual interactions;
(c) information to the recipients of the service on where content comes from and the reasoning about why it has been recommended, in a clear, easily accessible and concise summary format;
(d) the possibility for recipients of the service to view the profile or profiles used to curate user-generated content for them. Based on such information, recipients of the service should be able to rectify or request the deletion of those profiles.
2021/06/24
Committee: ITRE
Amendment 603 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them, including any technically possible option to turn off algorithmic selection within the recommender system entirely. Very large online platforms shall establish opt-in mechanisms for recommender systems by default, as minimum safeguards of data protection by design and by default within the meaning of Article 25 of Regulation (EU) 2016/679.
2021/06/24
Committee: ITRE
Amendment 606 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2 a. Recipients of the service who decide to opt in to recommender systems shall be able to:
(a) exclude certain content from their recommendations;
(b) exclude certain sources of content from their recommendations;
(c) easily withdraw their choice to opt in and no longer be part of the recommender system;
(d) ask for their profiles to be deleted;
(e) access the service even when refusing to use content recommendations, to ensure the opt-in is meaningful.
Recipients of the service shall be able to do so in an easy and free manner, and at any time.
2021/06/24
Committee: ITRE
Amendment 628 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article and to vetted not-for-profit bodies, organisations or associations which have been properly constituted in accordance with the law of a Member State, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
2021/06/24
Committee: ITRE
Amendment 635 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, platforms may only require that researchers be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or in related research methodologies, and commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/06/24
Committee: ITRE
Amendment 636 #
Proposal for a regulation
Article 31 – paragraph 4 a (new)
4 a. In order to be vetted, a not-for-profit body, organisation or association has to be properly constituted in accordance with the law of a Member State, have statutory objectives which are in the public interest, and be active in the field of the protection of consumer rights or of fundamental rights and freedoms with regard to the protection of personal data, freedom of expression and opinion, access to information, consumer protection, the right to equal treatment and the prohibition of discrimination.
2021/06/24
Committee: ITRE
Amendment 643 #
Proposal for a regulation
Article 33 – paragraph 3
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, the platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports. The Digital Services Coordinator should be required to publish its decision as to whether it upholds the platform’s decision, giving the reasons for this. This decision must be subject to review by the relevant judiciary body. For the avoidance of doubt, mere commercial confidentiality shall not be a reason for a very large online platform to fail to disclose the relevant information.
2021/06/24
Committee: ITRE
Amendment 664 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
2. Member States shall designate one of the competent authorities with relevant expertise in the field of data protection, consumer protection or the regulation of user-generated content as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for the application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. In particular, supervisory authorities designated under Regulation (EU) 2016/679 shall be tasked with the application and enforcement of the measures related to data processing set forth under this Regulation. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters related to this Regulation and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union.
2021/06/24
Committee: ITRE
Amendment 666 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 2 a (new)
The Board should create a publicly accessible list of all national Digital Coordinators and competent national authorities with the relevant expertise designated by Member States that will be regularly updated and monitored by the Board.
2021/06/24
Committee: ITRE
Amendment 669 #
Proposal for a regulation
Article 38 – paragraph 3 a (new)
3 a. Member States shall, via Digital Coordinator(s), provide for a clear legal basis for the cooperation between and among relevant national authorities, each acting within their respective areas of competence. The Digital Coordinator should publicly list all competent authorities that are involved in the cooperation and identify the circumstances in which cooperation should take place, and adopt guidelines for cooperation that include clear deadlines. Such competent authorities should be included in the publicly accessible list administered by the Board, as described in Article 38(2).
2021/06/24
Committee: ITRE
Amendment 672 #
Proposal for a regulation
Article 39 – paragraph 1
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, independent, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources, including skills, competence building and infrastructure, to carry out their tasks.
2021/06/24
Committee: ITRE
Amendment 674 #
Proposal for a regulation
Article 41 – paragraph 2 – point b
(b) the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end. Digital Services Coordinators and national authorities with relevant expertise to exercise oversight of this Regulation shall have the power to order the prohibition of the deployment of recommender systems at least until compliance with fundamental rights is guaranteed and the consumer and fundamental rights of online users are sufficiently protected;
2021/06/24
Committee: ITRE
Amendment 695 #
Proposal for a regulation
Article 48 – paragraph 3
3. The Board shall be chaired by a president elected from among its members on a rotating basis. The chair of the Board should not be allowed to lead any national regulatory office in their respective Member State at the same time. The length of the chair’s mandate should be limited to a maximum of three years, renewable once. The European Parliament Committee on Civil Liberties, Justice and Home Affairs should hear the candidates and designate the chair by vote. The chair of the Board shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure.
2021/06/24
Committee: ITRE
Amendment 699 #
(e a) issue own-initiative opinions;
2021/06/24
Committee: ITRE
Amendment 701 #
Proposal for a regulation
Article 49 – paragraph 1 – point e b (new)
(e b) issue opinions on matters, other than measures taken by the Commission, that contribute to the proper application of this Regulation.
2021/06/24
Committee: ITRE
Amendment 702 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. A not-for-profit body, organisation or association which has been properly constituted in accordance with the law of a Member State, has statutory objectives which are in the public interest, and is active in the field of the protection of recipients’ fundamental rights and freedoms with regard to the protection of their personal data, freedom of expression and opinion, access to information, consumer protection, the right to equal treatment and the prohibition of discrimination, may assist in this process and in the performance of the audits.
2021/06/24
Committee: ITRE
Amendment 703 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
1. The Commission shall adopt a non- compliance decision and act according to Article 59 where it finds that the very large online platform concerned does not comply with one or more of the following:
2021/06/24
Committee: ITRE