
47 Amendments of Annalisa TARDINO related to 2020/0361(COD)

Amendment 143 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly, in the strict observance of the principle of freedom of expression, and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as clearly illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question, be it a violation of the criminal, administrative or civil national legal framework.
2021/06/10
Committee: LIBE
Amendment 157 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act as soon as possible to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content while carefully assessing potential impacts on the freedom of expression.
2021/06/10
Committee: LIBE
Amendment 179 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by children rights trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/06/10
Committee: LIBE
Amendment 182 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Such reporting should also mention own-initiative measures taken to ensure pluralism on the platform. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/06/10
Committee: LIBE
Amendment 190 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision in a clear and user-friendly manner, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress.
2021/06/10
Committee: LIBE
Amendment 193 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide clear and user-friendly information about the redress procedure and to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. Recipients should be entitled to seek redress in the same language as the content that was referred to the internal complaint-handling system. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/06/10
Committee: LIBE
Amendment 196 #
Proposal for a regulation
Recital 46
(46) Action against illegal content involving minors can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by children rights trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such children rights trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in protecting minors, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on children rights trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/06/10
Committee: LIBE
Amendment 221 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should, under such mitigating measures, consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and freedom of expression and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 226 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.deleted
2021/06/10
Committee: LIBE
Amendment 232 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
2021/06/10
Committee: LIBE
Amendment 239 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.deleted
2021/06/10
Committee: LIBE
Amendment 243 #
Proposal for a regulation
Recital 65
(65) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, very large online platforms should appoint compliance officers, which should have the necessary qualifications to operationalise measures and monitor the compliance with this Regulation within the platform’s organisation. Compliance officers should be provided with dedicated training on the applicable legal framework to protect freedom of expression. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation. The provider shall ensure that, in particular, decisions on notices are processed by qualified staff provided with dedicated training on the applicable legal framework to protect freedom of expression.
2021/06/10
Committee: LIBE
Amendment 247 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.deleted
2021/06/10
Committee: LIBE
Amendment 284 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the criminal, administrative or civil legal framework of a Member State;
2021/06/10
Committee: LIBE
Amendment 302 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a national court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
2021/06/10
Committee: LIBE
Amendment 304 #
Proposal for a regulation
Article 4 – paragraph 2
2. This Article shall not affect the possibility for a national court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
2021/06/10
Committee: LIBE
Amendment 308 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable, as soon as possible, access to the illegal content, in the strict observance of the principle of freedom of expression.
2021/06/10
Committee: LIBE
Amendment 310 #
Proposal for a regulation
Article 5 – paragraph 1 a (new)
1 a. Without prejudice to specific provisions set out in Union law or within national administrative or legal frameworks, providers of hosting services shall, upon obtaining actual knowledge or awareness, remove or disable access to illegal content as soon as possible and in any event: (a) within 24 hours where the illegal content can seriously harm public policy, public security, including content promoting terrorism, or public health, or can seriously harm consumers’ health or safety. Such provisions should apply specifically to child sexual abuse material, grooming and cyberbullying; (b) within 4 days in all other cases. Where the provider of hosting services cannot comply with the obligation in paragraph 1a on grounds of force majeure or for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the competent authority having issued an order pursuant to Article 8, or the recipient of the service having submitted a notice pursuant to Article 14, of those grounds.
2021/06/10
Committee: LIBE
Amendment 312 #
Proposal for a regulation
Article 5 – paragraph 4
4. This Article shall not affect the possibility for a national court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
2021/06/10
Committee: LIBE
Amendment 332 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
2021/06/10
Committee: LIBE
Amendment 339 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
- clear and user-friendly information about the redress procedure available to the provider of the service and to the recipient of the service who provided the content;
2021/06/10
Committee: LIBE
Amendment 343 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
- Users should be entitled to seek redress in the same language as the content that was removed;
2021/06/10
Committee: LIBE
Amendment 421 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as own-initiative measures taken to ensure pluralism on the platform;
2021/06/10
Committee: LIBE
Amendment 448 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 may give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
2021/06/10
Committee: LIBE
Amendment 463 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. The provider shall ensure that decisions on notices are processed by qualified staff provided with dedicated training on the applicable legal framework to protect freedom of expression.
2021/06/10
Committee: LIBE
Amendment 476 #
Proposal for a regulation
Article 15 – paragraph 2 – point f
(f) clear and user-friendly information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
2021/06/10
Committee: LIBE
Amendment 491 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least twelve months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/06/10
Committee: LIBE
Amendment 500 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. Recipients shall be entitled to seek redress in the same language as the content that was referred to the internal complaint-handling system.
2021/06/10
Committee: LIBE
Amendment 534 #
Proposal for a regulation
Article 19 – title
19 Children rights trusted flaggers
2021/06/10
Committee: LIBE
Amendment 538 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by children rights trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/06/10
Committee: LIBE
Amendment 540 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2. The status of children rights trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated to meet all of the following conditions:
2021/06/10
Committee: LIBE
Amendment 542 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content involving minors;
2021/06/10
Committee: LIBE
Amendment 544 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests to protect children rights and is independent from any online platform;
2021/06/10
Committee: LIBE
Amendment 556 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.deleted
2021/06/10
Committee: LIBE
Amendment 564 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a defined period of time and after having issued three prior warnings, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/06/10
Committee: LIBE
Amendment 566 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
2021/06/10
Committee: LIBE
Amendment 574 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
2021/06/10
Committee: LIBE
Amendment 627 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/06/10
Committee: LIBE
Amendment 631 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/06/10
Committee: LIBE
Amendment 654 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;deleted
2021/06/10
Committee: LIBE
Amendment 664 #
Proposal for a regulation
Article 27 – paragraph 2
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following: (a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; (b) best practices for very large online platforms to mitigate the systemic risks identified.deleted
2021/06/10
Committee: LIBE
Amendment 665 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33;deleted
2021/06/10
Committee: LIBE
Amendment 669 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the systemic risks identified.deleted
2021/06/10
Committee: LIBE
Amendment 671 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.deleted
2021/06/10
Committee: LIBE
Amendment 759 #
Proposal for a regulation
Article 32 – paragraph 2
2. Very large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned. Compliance officers shall have a deep knowledge of the existing legal framework on freedom of expression.
2021/06/10
Committee: LIBE
Amendment 763 #
Proposal for a regulation
Article 32 – paragraph 4
4. Very large online platforms shall take the necessary measures to ensure that the compliance officers can perform their tasks in an independent and non-politicised manner.
2021/06/10
Committee: LIBE
Amendment 770 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
(b) electronic submission of notices by trusted flaggers under Article 19, including through application programming interfaces;deleted
2021/06/10
Committee: LIBE