117 Amendments of Marina KALJURAND related to 2020/0361(COD)
Amendment 139 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined appropriately and also covers information relating to illegal content, products, services and activities where such information is itself illegal. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 150 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, are not considered disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.
_________________
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
Amendment 153 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services has knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. The exemptions from liability established in this Regulation should not depend on uncertain notions such as an ‘active’, ‘neutral’ or ‘passive’ role of providers.
Amendment 158 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, after having become aware of the unlawful nature of content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 281 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Directive 2002/58/EC.
Amendment 282 #
Proposal for a regulation
Article 1 – paragraph 5 – subparagraph 1 (new)
This Regulation shall not apply to matters relating to information society services covered by Regulation (EU) 2016/679 and Directive 2002/58/EC.
Amendment 296 #
Proposal for a regulation
Article 2 a (new)
Article 2 a
Digital privacy
Where technically possible, a provider of an information society service shall enable the use of and payment for that service without collecting personal data of the recipient.
A provider of an information society service shall process personal data concerning the use of the service by a recipient only to the extent strictly necessary to enable the recipient to use the service or to charge the recipient for the use of the service.
An operator of an online platform shall be allowed to process personal data concerning the use of the service by a recipient for the sole purpose of operating a recommender system if the recipient has given his or her explicit consent, as defined in Article 4(11) of Regulation (EU) 2016/679.
Member States shall not require a provider of information society services to retain personal data concerning the use of the service by all recipients.
A provider of an information society service shall have the right to provide and support end-to-end encryption services.
Amendment 298 #
Proposal for a regulation
Article 3 – paragraph 3
Amendment 305 #
Proposal for a regulation
Article 4 – paragraph 2
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 313 #
Proposal for a regulation
Article 5 – paragraph 4
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 315 #
Proposal for a regulation
Article 6 – title
Amendment 316 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the compulsory measures to comply with the requirements of Union law, including those set out in this Regulation.
Amendment 321 #
Proposal for a regulation
Article 7 – title
No general monitoring, active fact-finding or automated content moderation obligations
Amendment 324 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation shall be imposed on providers of intermediary services to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal activity.
Amendment 325 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services shall not be obliged to use automated tools for content moderation.
Amendment 333 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt, via a secure communications channel, of an authenticated order to act against a specific item of illegal content, issued by a national judicial or administrative authority, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 336 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
— the identification details of the judicial authority issuing the order and a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
Amendment 340 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content;
Amendment 347 #
Proposal for a regulation
Article 8 – paragraph 2 – point a a (new)
(a a) the order is securely and easily authenticated;
Amendment 349 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) the territorial scope of an order addressed to a provider that has its main establishment, or, if not established in the Union, its legal representation in another Member State is limited to the issuing Member State;
Amendment 351 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(b b) where addressed to a provider that has its main establishment outside the Union, the territorial scope of the order, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
Amendment 354 #
Proposal for a regulation
Article 8 – paragraph 3
3. The Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67.
Amendment 358 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4 a. The Commission shall, by means of implementing acts, define a European technical standard for the secure communication channels that also provide for the authentication of the orders.
Amendment 360 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt, via a secure communications channel, of an authenticated order to provide a specific item of information about one or more specific individual recipients of the service, issued by a national judicial or administrative authority on the basis of the applicable Union or national law, in conformity with Union law, for the purpose of preventing serious threats to public security, inform without undue delay the authority issuing the order of its receipt and the effect given to the order via a secure communications channel.
Amendment 364 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— the identification details of the judicial authority issuing the order, a statement of reasons explaining the objective for which the information is required and the grounds for the necessity and proportionality of the request, taking due account of its impact on the fundamental rights of the specific recipient of the service whose data is sought and the seriousness of the offence;
Amendment 369 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
- a unique identifier of the recipients about whom information is sought;
Amendment 371 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about redress mechanisms available to the provider and to the recipients of the service concerned;
Amendment 376 #
Proposal for a regulation
Article 9 – paragraph 2 – point a a (new)
(a a) the order is securely and easily authenticated;
Amendment 377 #
Proposal for a regulation
Article 9 – paragraph 2 – point a b (new)
(a b) the order is issued for the purpose of preventing serious threats to public security;
Amendment 378 #
Proposal for a regulation
Article 9 – paragraph 2 – point a c (new)
(a c) the order seeks information on a suspect or suspects of a serious threat to public security;
Amendment 379 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already legally collected for the purposes of providing the service and which lies within its control;
Amendment 382 #
Proposal for a regulation
Article 9 – paragraph 3
3. The Digital Services Coordinator from the Member State of the national judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the order referred to in paragraph 1 to all Digital Services Coordinators through the system established in accordance with Article 67.
Amendment 383 #
Proposal for a regulation
Article 9 – paragraph 3 a (new)
3 a. The provider shall inform the recipient whose data is being sought without undue delay. As long as necessary and proportionate, in order to protect the fundamental rights of another person, the issuing judicial authority, taking due account of the impact of the request on the fundamental rights of the person whose data is sought, may request the provider to delay informing the recipient. Such a request shall be duly justified, shall specify the duration of the obligation of confidentiality and shall be subject to periodic review.
Amendment 384 #
Proposal for a regulation
Article 9 – paragraph 3 b (new)
3 b. This Article shall apply, mutatis mutandis, to competent administrative authorities ordering online platforms to provide the information listed in Article 22.
Amendment 385 #
Proposal for a regulation
Article 9 – paragraph 3 c (new)
3 c. Where information is sought for the purpose of criminal proceedings, Regulation (EU) 2021/XXXX on access to electronic evidence shall apply.
Amendment 386 #
Proposal for a regulation
Article 9 – paragraph 3 d (new)
3 d. Providers of intermediary services shall transfer the personal data on recipients of their service requested by public authorities only where the conditions of this article are met.
Amendment 387 #
Proposal for a regulation
Article 9 – paragraph 3 e (new)
3 e. The Commission shall, by means of implementing acts, establish a common European information exchange system with secure channels for the handling of authorised cross-border communications, authentication and transmission of the order referred to in paragraph 1 and, where applicable, of the requested data between the competent judicial authority and the provider.
Amendment 429 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Online advertising transparency
1. Providers of intermediary services that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise and unambiguous manner and in real time:
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
(c) clear, meaningful and uniform information about the parameters used to determine the recipient to whom the advertisement is displayed; and
(e) whether the advertisement was displayed using an automated tool and the identity of the person responsible for that tool.
2. The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article.
3. Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed. They shall also inform public authorities, upon their request.
4. Providers of intermediary services that display advertising on their online interfaces shall be able to give easy access to public authorities, NGOs and researchers, upon their request, to information related to direct and indirect payments or any other remuneration received to display the corresponding advertisement on their online interfaces.
Amendment 431 #
Proposal for a regulation
Article 13 b (new)
Article 13 b
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising to recipients of their service, of other information society services, or directly to the public.
2. Providers of intermediary services may show targeted digital advertising based on contextual information.
3. The use of the contextual information referred to in paragraph 2 shall be permissible only if it does not allow for the direct or indirect identification of a natural person, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
Amendment 432 #
Proposal for a regulation
Article 13 c (new)
Article 13 c
Recipients’ consent for advertising practices
1. Providers of intermediary services shall not, by default, subject the recipients of their services to targeted, micro-targeted and behavioural advertising, unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent to receiving such advertising. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information about the use of their personal data.
2. When processing personal data for targeted, micro-targeted and behavioural advertising, where consent has been received, online intermediaries shall comply with relevant Union law and shall not engage in activities that can lead to pervasive tracking, such as disproportionate combination of data collected by platforms, or disproportionate processing of special categories of personal data.
3. Providers of intermediary services shall organise their online interface in a way that provides clear information regarding the advertising parameters and allows the recipients of services to easily and efficiently access and modify those advertising parameters. Providers of intermediary services shall regularly monitor the use of advertising parameters by the recipients of services and make improvements to their use where necessary.
Amendment 442 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Amendment 447 #
Proposal for a regulation
Article 14 – paragraph 3
3. The individual or entity submitting the notice may choose to provide their name and an electronic mail address that shall not be disclosed to the content provider except in cases of alleged violations of intellectual property rights.
Amendment 449 #
4 a. Upon receipt of the notice and using available contact details, the service provider shall notify the provider of the information regarding the elements referred to in paragraph 2 and give them the opportunity to reply before taking a decision.
Amendment 450 #
Proposal for a regulation
Article 14 – paragraph 4 b (new)
4 b. Notified information shall remain accessible until a decision is taken in respect of that information.
Amendment 451 #
Proposal for a regulation
Article 14 – paragraph 4 c (new)
4 c. The provider shall ensure that decisions on notices are taken by qualified staff, to whom adequate initial and ongoing training on the applicable legislation and fundamental rights standards as well as appropriate working conditions are to be provided, including, where necessary, the opportunity to seek qualified legal advice.
Amendment 453 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify the individual or entity that provided the notification, as well as the provider of the information, of its decision in respect of the information to which the notice relates, as well as providing information on the redress possibilities in respect of that decision.
Amendment 459 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on the use of such automated means in the notification referred to in paragraph 4.
Amendment 467 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to it, and where the notifier chose to provide contact details, it shall inform them, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 475 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 483 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are provided to staff, including, where necessary, the opportunity to seek professional support, qualified psychological assistance and qualified legal advice. This paragraph shall not apply where information has likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 515 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned to seek redress against the decision before a court in accordance with the applicable law.
Amendment 519 #
Proposal for a regulation
Article 18 – paragraph 2 – point a a (new)
(a a) it includes legal experts;
Amendment 521 #
Proposal for a regulation
Article 18 – paragraph 2 – point b
(b) it has the necessary expertise in relation to issues concerning one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, therefore allowing the body to contribute effectively to the settlement of a dispute;
Amendment 522 #
Proposal for a regulation
Article 18 – paragraph 2 – point d
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union;
Amendment 527 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 2
Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, publicly available.
Amendment 550 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inadequately substantiated or incorrect notices, or notices regarding legal content, through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 553 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 558 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that have received two or more orders to act regarding illegal content in the previous 12 months.
Amendment 569 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of suspensions of service and items of manifestly unfounded notices or complaints, submitted in the past year;
Amendment 580 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is imminent, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its reasoned suspicion and provide the information giving rise to it.
Amendment 585 #
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place or likely to take place, or the Member State where a suspected offender resides or is located, or the Member State where a victim of the suspected offence resides or is located.
Amendment 588 #
Proposal for a regulation
Article 22 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted after the receipt of multiple orders to act, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
Amendment 623 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the design, functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 624 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to freedom of expression and information, privacy, protection of personal data, discrimination, equality and the rights of children as prescribed in Union or Member State law;
Amendment 628 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use, undisclosed paid influence, or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, and other categories of vulnerable service users, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 633 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting, targeting, and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 636 #
Proposal for a regulation
Article 27 – title
Amendment 639 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place transparent, reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
Amendment 643 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms may put in place reasonable, proportionate and effective specific measures to address the dissemination of illegal content through their services. Such measures may include, where applicable:
Amendment 646 #
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(a a) appropriate technical and operational measures or capacities, such as appropriate staffing or technical means to expeditiously remove or disable access to illegal content the platform is aware of, or has received an order to act upon;
Amendment 647 #
Proposal for a regulation
Article 27 – paragraph 1 – point a b (new)
(a b) easily accessible and user-friendly mechanisms for users to report or flag allegedly illegal content, and mechanisms for user moderation;
Amendment 651 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk;
Amendment 655 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Amendment 658 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms and stakeholders through the codes of conduct and the crisis protocols referred to in Article 35 and 37 respectively.
Amendment 660 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1 a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
Amendment 662 #
Proposal for a regulation
Article 27 – paragraph 2
Amendment 670 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices and recommendations for very large online platforms to effectively mitigate the systemic risks identified.
Amendment 675 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific risks, in particular to present best practices and propose possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those recommendations the Commission shall organise public consultations.
Amendment 676 #
Proposal for a regulation
Article 27 – paragraph 3 a (new)
3 a. After establishing that a very large online platform has received a substantial number of orders to act, the competent Digital Services Coordinator may request necessary, proportionate and effective additional specific measures that the platform is obliged to implement. The competent Digital Services Coordinator shall not impose a general monitoring obligation or the use of automated tools. The request shall take into account, in particular, the technical feasibility of the measures, the size and economic capacity of the platform and the effect of such measures on the fundamental rights of the users and on the freedom of expression and the freedom to receive and impart information and ideas in an open and democratic society. Such a request shall be sent by the Digital Services Coordinator of the Member State in which the platform has its main establishment, or, if not established in the Union, its legal representative. The platform may, at any time, request the competent Digital Services Coordinator to review and, where appropriate, revoke such request.
Amendment 678 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to external independent audits to assess compliance with the following:
Amendment 679 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
Amendment 681 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
Amendment 684 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) Compliance with the obligations set out in Chapter III;
Amendment 685 #
Proposal for a regulation
Article 28 – paragraph 1 – point a a (new)
(a a) Adequacy of the risk assessment undertaken pursuant to Article 26.1 and the corresponding risk mitigation measures undertaken pursuant to Article 27.1;
Amendment 686 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) Compliance with any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
Amendment 687 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37 and self- or co-regulatory actions that they have undertaken.
Amendment 688 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by expert organisations, previously vetted by the Board, which:
Amendment 689 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations, vetted by the Board, which:
Amendment 690 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned as well as from other very large online platforms;
Amendment 691 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from and do not have conflicts of interest with the very large online platform concerned;
Amendment 692 #
Proposal for a regulation
Article 28 – paragraph 2 – point b
(b) have demonstrated expertise in the area of risk management, technical competence and capabilities, and, where applicable, can demonstrably draw upon expertise in fields related to the risks investigated or related research methodologies;
Amendment 693 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
(c) have demonstrated objectivity and professional ethics, based in particular on adherence to relevant codes of practice or appropriate standards.
Amendment 694 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish a meaningful, granular, comprehensive and independent audit report for each audit. The report shall be in writing and include at least the following:
Amendment 695 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish a meaningful, granular, comprehensive audit report for each audit. The report shall be in writing and include at least the following:
Amendment 696 #
Proposal for a regulation
Article 28 – paragraph 3 – point d
(d) a description of the main findings drawn from the audit and a summary of the main findings;
Amendment 697 #
Proposal for a regulation
Article 28 – paragraph 3 – point d a (new)
Amendment 698 #
Proposal for a regulation
Article 28 – paragraph 3 – point d b (new)
(d b) a description of the third parties consulted to inform the audit;
Amendment 699 #
Proposal for a regulation
Article 28 – paragraph 3 – point e
(e) an audit opinion on whether the very large online platform subject to the audit meaningfully complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative;
Amendment 704 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms shall ensure auditors have access to all relevant information to perform their duties. Very large online platforms receiving an audit report that contains evidence of wrongdoings shall apply the recommendations addressed to them with a view to taking all the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Amendment 705 #
Proposal for a regulation
Article 28 – paragraph 4 – subparagraph 1 (new)
Auditors shall submit their audit report to the Board at the same time as to the very large online platform concerned. Within a reasonable period of time, the Board shall issue recommendations, monitor the implementation of the report and suggest the adoption of sanctions by the competent Digital Services Coordinator when the very large online platform fails to abide by the Regulation.
Amendment 706 #
Proposal for a regulation
Article 28 – paragraph 4 – point 1 (new)
(1) The Board, after consulting stakeholders and the Commission, shall publish guidelines about how audits should be conducted by the auditors, how they should be implemented by very large online platforms and how authorities will monitor and enforce the Regulation in this regard.
Amendment 707 #
Proposal for a regulation
Article 28 – paragraph 4 – point 2 (new)
(2) The Board shall publish and regularly update a list of vetted auditors that very large online platforms can resort to. The Board shall publish and regularly review detailed criteria auditors need to meet.
Amendment 709 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, meaningful information about the logic involved and the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including the provision of at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. Basing recommender systems on profiling shall require the explicit consent of the recipient, as defined in Article 4, point (11), of Regulation (EU) 2016/679.
Amendment 716 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1 a. Very large online platforms that use recommender systems shall allow the recipient of the service to have information presented to them in a chronological order only and alternatively, where technically possible, to use third-party recommender systems. Third-party recommender systems shall have access to the same information that is available to the recommender systems used by the platform.
Amendment 738 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research in the public interest.
Amendment 743 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. This shall include personal data only where it is lawfully accessible by the public.
Amendment 758 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7 a. Upon completion of their research, the vetted researchers, who have been granted access to the data, shall publish their findings.
Amendment 760 #
Proposal for a regulation
Article 32 – paragraph 2
2. Very large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned.
Amendment 765 #
Proposal for a regulation
Article 33 – paragraph 2 – point a
Amendment 766 #
Proposal for a regulation
Article 33 – paragraph 2 – point b
(b) the related specific measures implemented pursuant to Article 27;
Amendment 769 #
Proposal for a regulation
Article 33 a (new)
Article 33 a
Interoperability
1. By 31 December 2024 very large online platforms shall make the main functionalities of their services interoperable with other online platforms to enable cross-platform exchange of information. This obligation shall not limit, hinder or delay their ability to solve security issues. Very large online platforms shall publicly document all application programming interfaces they make available.
2. The Commission shall adopt implementing measures specifying the nature and scope of the obligations set out in paragraph 1.
Amendment 782 #
Proposal for a regulation
Article 36 – paragraph 1
Amendment 783 #
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct address at least:
Amendment 784 #
Proposal for a regulation
Article 36 – paragraph 2 – point a
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in Articles 13a(new), 13b(new) and points (b) and (c) of Article 24;
Amendment 786 #
Proposal for a regulation
Article 37
Amendment 799 #
Proposal for a regulation
Article 41 – paragraph 1 – point a
(a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period, with the exception of information covered by professional secrecy requirements;
Amendment 812 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;