45 Amendments of Carlo FIDANZA related to 2020/0361(COD)
Amendment 234 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be appropriately differentiated from the concept of “potentially harmful content”. In particular, the concept of “illegal content” should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 259 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical, automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider or where the provider prioritises or promotes the content, its presentation or monetisation beyond offering basic search and indexing functionalities that are absolutely necessary to navigate the content.
Amendment 264 #
Proposal for a regulation
Recital 18 a (new)
(18a) Those exemptions from liability should also not be available to providers of intermediary services that do not comply with the due diligence obligations in this Regulation. The conditionality should further ensure that the standards to qualify for those exemptions contribute to a high level of safety and trust in the online environment in a manner that promotes a fair balance of the rights of all stakeholders.
Amendment 267 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately engages with a recipient of the services in order to undertake illegal activities does not provide its service neutrally or passively and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 284 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, hosting services should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as they present the relevant information relating to the transactions or exchanges at issue in such a way that it leads consumers to believe that the information was provided by those service providers themselves or by recipients of the service acting under their authority or control, and that those service providers thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 296 #
Proposal for a regulation
Recital 25
Amendment 300 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against manifestly illegal content related to serious crimes that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith, diligently and never on a discretionary basis. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 321 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial, administrative or police authorities should be the only bodies entitled to decide on the removal of specific content, except in the case of manifestly illegal content related to serious crimes, which might require immediate intervention by the intermediary services provider. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
Amendment 325 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
Amendment 329 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition on Member States to impose a monitoring obligation of a general nature. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
Amendment 335 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
Amendment 337 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information should include the relevant contact details necessary to ensure such compliance. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
Amendment 389 #
Proposal for a regulation
Recital 42
(42) A hosting service provider should not decide by itself to remove or disable information provided by a recipient of the service, either following receipt of a notice or acting on its own initiative, including through the use of automated means, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression, unless the content is manifestly illegal and related to serious crimes. Even in the latter case, available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 392 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent the reappearance of the notified or equivalent illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 408 #
Proposal for a regulation
Recital 46
(46) Action against content that is illegal under existing law can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by the national authorities responsible are followed up immediately. Such authorities must be solely public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’).
Amendment 414 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have significant legitimate interests, have a proven record in flagging content with a high rate of accuracy and have demonstrated competence for the purposes of detecting, identifying and notifying illegal content. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council43.
__________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 676 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage or the allowance of storage of information provided by, and at the request of, a recipient of the service;
Amendment 690 #
Proposal for a regulation
Article 2 – paragraph 1 – point g – point i (new)
(i) ‘potentially harmful content’ means content the unlawfulness of which is not beyond reasonable doubt, but which contains suspicious indicators;
Amendment 747 #
Proposal for a regulation
Article 2 a (new)
Article 2a
Conditionality on compliance with due diligence obligations
Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when they do not comply with the due diligence obligations set out in Chapter III of this Regulation.
Amendment 754 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the storage or the allowance of storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service on condition that the provider:
Amendment 771 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the hosting service provider itself or by a recipient of the service who is acting under its authority or control.
Amendment 774 #
Proposal for a regulation
Article 5 – paragraph 3 a (new)
3a. Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when their main purpose is to engage in or facilitate illegal activities.
Amendment 779 #
Proposal for a regulation
Article 6
Amendment 800 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. Under the condition that necessary safeguards are provided, such orders could, in particular, consist of catalogue-wide and dynamic injunctions by courts or administrative authorities requiring the termination or prevention of any infringement.
Amendment 821 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
— one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned;
Amendment 837 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10.
Amendment 864 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
Amendment 876 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already collected for the purposes of enabling the identification of recipients of the service and which lies within its control;
Amendment 881 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10;
Amendment 1042 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 1058 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned, and shall create an obligation on behalf of the notified provider of hosting services to remove or disable access to the notified information expeditiously.
Amendment 1072 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. When a decision has been taken to remove or disable information, the providers of hosting services shall take all necessary measures to prevent the same or equivalent illegal material from reappearing on their service. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 1272 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it has a significant legitimate interest, either collectively or as an individual entity, is independent from any online platform, and has proven expertise in flagging illegal content with a high rate of accuracy;
Amendment 1305 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of wrongful notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 1554 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
Amendment 1557 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination and amplification of illegal content through their services;
Amendment 1561 #
Proposal for a regulation
Article 26 – paragraph 1 – point a a (new)
(aa) the funding of illegal content, including models based on advertising;
Amendment 1568 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for human dignity, private and family life, freedom of expression and information, right to property, the prohibition of discrimination and the rights of the child, as enshrined in Articles 1, 7, 11, 17, 21 and 24 of the Charter respectively;
Amendment 1714 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make available to relevant authorities, publishers, advertisers and vetted researchers that meet the requirements listed in paragraph 4 of this Article or Article 31, through application programming interfaces, a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 1720 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a) the content of the advertisement, in particular, the name of the product, service or brand and the object of the advertisement;
Amendment 1733 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in each country and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
Amendment 1736 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. When very large online platforms sell advertising for display on their online interface, the contract signed with the buyer or the buyer’s representative shall include a clause providing that the platform guarantees that no content adjacent to the advertisement is incompatible with the terms and conditions of the platform or with the law of the Member States of residence of the recipients of the service to whom the advertisement will be displayed. Any clause to the contrary shall be null and void.
Amendment 1742 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, upon the request of advertisers and publishers, independent audits performed by organisations complying with the criteria set out in Article 28(2), on a reasonable frequency, under fair and proportionate conditions agreed upon by platforms, advertisers and publishers, to:
(a) conduct a quantitative and qualitative assessment of cases where advertising is associated with illegal content;
(b) detect fraudulent use of their services to fund illegal activities;
(c) assess the performance of their tools in terms of brand safety.
The report shall include an audit opinion on the performance of their tools in terms of brand safety, either positive, positive with comments or negative and, where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance. These platforms shall make available to advertisers and publishers, upon their request, the results of that audit.
Amendment 1880 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30, but also to further transparency between all the players involved in the programmatic advertising value chain.
Amendment 1887 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
(ba) the set-up of a common or unique identifier constituted by multiple elements (such as the advertiser identifier and references to the brand of the campaign, its product, and the reference of the purchase) which enables advertisers and publishers to identify and track a campaign throughout its lifecycle.