
Activities of Cornelia ERNST related to 2018/0331(COD)

Shadow reports (1)

RECOMMENDATION FOR SECOND READING on the Council position at first reading with a view to the adoption of a regulation of the European Parliament and of the Council on addressing the dissemination of terrorist content online
2021/04/21
Committee: LIBE
Dossiers: 2018/0331(COD)
Documents: PDF(168 KB) DOC(53 KB)
Authors: Patryk JAKI (MEP 197516)

Amendments (109)

Amendment 37 #
Proposal for a regulation
Title 1
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on limiting the dissemination of terrorist content online A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018
2019/02/25
Committee: LIBE
Amendment 43 #
Proposal for a regulation
Recital 1
(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to fundamental rights, in particular the right to freedom of expression and information, the right to privacy and to protection of personal data.
2019/02/25
Committee: LIBE
Amendment 53 #
Proposal for a regulation
Recital 2
(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.
2019/02/25
Committee: LIBE
Amendment 55 #
Proposal for a regulation
Recital 3
(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.
deleted
2019/02/25
Committee: LIBE
Amendment 61 #
Proposal for a regulation
Recital 4
(4) A clear legislative framework builds on voluntary cooperation between Member States and hosting service providers and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts, which were reinforced by the Commission Recommendation (EU) 2018/334 7. _________________ 7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).
2019/02/25
Committee: LIBE
Amendment 71 #
Proposal for a regulation
Recital 5
(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met. _________________ 8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).
2019/02/25
Committee: LIBE
Amendment 76 #
Proposal for a regulation
Recital 6
(6) Rules to limit the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.
2019/02/25
Committee: LIBE
Amendment 78 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the right to freedom of expression and information as well as the right to privacy, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law. This Regulation should not affect the applicable rules on the processing of personal data, notably Regulation (EU) 2016/679 and Directive (EU) 2016/680.
2019/02/25
Committee: LIBE
Amendment 97 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council 9. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that manifestly incites the commission of terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their demonstrable potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content. _________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/25
Committee: LIBE
Amendment 101 #
Proposal for a regulation
Recital 9 a (new)
(9 a) Where the disseminated material is published under the editorial responsibility of a content provider, any decision as to the removal of such content can only be made based on a judicial order. This is necessary in order to fully respect the law of the Union and the right to freedom of expression and the right to freedom and pluralism of the media as enshrined in Article 11 of the Charter of Fundamental Rights.
2019/02/25
Committee: LIBE
Amendment 107 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and in making the information stored available to the public. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to the public. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.
2019/02/25
Committee: LIBE
Amendment 117 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non- discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information.
2019/02/25
Committee: LIBE
Amendment 122 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task, as long as they are independent and impartial public authorities. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that the access to the terrorist content identified in the removal order is disabled without undue delay.
2019/02/25
Committee: LIBE
Amendment 132 #
Proposal for a regulation
Recital 14
(14) The competent authority should transmit the removal order directly to the addressee in the same Member State and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 . _________________ 12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).
2019/02/25
Committee: LIBE
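The transmission requirements described in Recital 14 above (a written record, verifiable authenticity, and accurate timestamps for sending and receipt) can be pictured with a minimal sketch. The Regulation does not prescribe any data format or technology; the field names and the HMAC-based integrity check below are purely illustrative assumptions, not part of the text.

import hashlib
import hmac
from dataclasses import dataclass
from datetime import datetime, timezone

SHARED_KEY = b"example-key-agreed-out-of-band"  # hypothetical; the Regulation mandates no specific mechanism

@dataclass
class RemovalOrder:
    order_id: str
    issuing_authority: str
    content_url: str
    sent_at: str          # time of sending, recorded by the issuing authority
    signature: str = ""   # lets the provider check the authenticity of the written record

    def sign(self) -> None:
        payload = f"{self.order_id}|{self.issuing_authority}|{self.content_url}|{self.sent_at}"
        self.signature = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()

    def verify(self) -> bool:
        payload = f"{self.order_id}|{self.issuing_authority}|{self.content_url}|{self.sent_at}"
        expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, self.signature)

def acknowledge(order: RemovalOrder) -> dict:
    """Produce a written record of receipt with an accurate timestamp."""
    return {
        "order_id": order.order_id,
        "authentic": order.verify(),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

order = RemovalOrder("RO-001", "Competent authority X", "https://example.org/item/1",
                     datetime.now(timezone.utc).isoformat())
order.sign()
print(acknowledge(order))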
Amendment 134 #
Proposal for a regulation
Recital 15
(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/794 13 remains unaffected. _________________ 13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
deleted
2019/02/25
Committee: LIBE
Amendment 145 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including using automated means in certain cases, may be used to tackle terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider is an indication of a low level of exposure to terrorist content. This Regulation does not create any obligation on hosting service providers to put in place proactive measures.
2019/02/25
Committee: LIBE
Amendment 147 #
Proposal for a regulation
Recital 17
(17) If they decide to put in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is guaranteed. They should therefore perform and make public a risk assessment on the level of exposure to terrorism content that is also based on the number of removal orders and referrals received, as well as draw up a remedial action plan to tackle terrorist content proportionate to the level of risk identified. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.
2019/02/25
Committee: LIBE
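As a purely illustrative reading of the safeguard described in Recital 17 above (automated means may flag material, but human oversight and verification decide before anything is removed), a minimal sketch might look as follows; the classifier, threshold and function names are hypothetical assumptions and are not drawn from the Regulation or the amendments.

from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    text: str

def automated_flag(item: Item) -> float:
    """Hypothetical classifier returning a score in [0, 1]; a real system would use a trained model."""
    return 0.9 if "attack instructions" in item.text.lower() else 0.1

def human_review(item: Item) -> bool:
    """Placeholder for the human verification step required before removal."""
    print(f"Review queue: {item.item_id} needs a human decision")
    return False  # default: do not remove without an explicit human confirmation

def process(item: Item, threshold: float = 0.8) -> str:
    score = automated_flag(item)
    if score < threshold:
        return "keep"
    # Automated means only flag; the removal decision rests with a human reviewer.
    return "remove" if human_review(item) else "keep"

print(process(Item("c1", "Holiday pictures from Salzburg")))
print(process(Item("c2", "attack instructions ...")))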
Amendment 154 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In particular, the hosting service provider shall provide the competent authorities with all necessary information about the automated tools used to allow a thorough public oversight on the effectiveness of the tools and to ensure that the latter do not produce discriminatory, untargeted, unspecific or unjustified results.
2019/02/25
Committee: LIBE
Amendment 164 #
Proposal for a regulation
Recital 19
(19) The competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures adopted by the hosting service provider. Such dialogue should not lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. The competent authority and the hosting service provider should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the right to freedom of expression and information, the right to privacy and the freedom to conduct a business, and provide appropriate justification.
2019/02/25
Committee: LIBE
Amendment 170 #
Proposal for a regulation
Recital 20
(20) The obligation on hosting service providers to preserve removed content and related data, should be laid down for specific purposes and limited in time to what is necessary. There is need to extend the preservation requirement to related data to the extent that any such data would otherwise be lost as a consequence of the removal of the content in question. Related data can include data such as ‘subscriber data’, including in particular data pertaining to the identity of the content provider as well as ‘access data’, including for instance data about the date and time of use by the content provider, or the log-in to and log-off from the service, together with the IP address allocated by the internet access service provider to the content provider.
deleted
2019/02/25
Committee: LIBE
Amendment 174 #
Proposal for a regulation
Recital 21
(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.
2019/02/25
Committee: LIBE
Amendment 179 #
Proposal for a regulation
Recital 22
(22) To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences. However, this period may be prolonged for the period that is necessary in case the review proceedings are initiated but not finalised within the six months period upon request by the authority carrying out the review. This duration should be sufficient to allow law enforcement authorities to preserve the necessary evidence in relation to investigations, while ensuring the balance with the fundamental rights concerned.
2019/02/25
Committee: LIBE
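The retention rule sketched in Recital 22 above (preserve for six months, prolonged only while review proceedings initiated within that period are still pending) amounts to simple date arithmetic. The sketch below is an illustration under those assumptions, not an implementation mandated by the text; the six-month approximation in days is the author's own.

from datetime import date, timedelta

SIX_MONTHS = timedelta(days=182)  # rough approximation of the six-month preservation period

def may_delete(removed_on: date, today: date, review_pending: bool) -> bool:
    """Preserved data may be deleted once six months have passed,
    unless review proceedings initiated in time are still ongoing."""
    if review_pending:
        return False  # preservation is prolonged for as long as the review lasts
    return today >= removed_on + SIX_MONTHS

print(may_delete(date(2019, 3, 1), date(2019, 9, 15), review_pending=False))  # True
print(may_delete(date(2019, 3, 1), date(2019, 9, 15), review_pending=True))   # False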
Amendment 192 #
Proposal for a regulation
Recital 25
(25) Complaint procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. Hosting service providers should therefore establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider. The requirement for the hosting service provider to reinstate the content where it has been removed in error, does not affect the possibility of hosting service providers to enforce their own terms and conditions on other grounds.
2019/02/25
Committee: LIBE
Amendment 196 #
Proposal for a regulation
Recital 26
(26) Effective legal protection according to Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union requires that persons are able to ascertain the reasons upon which the content uploaded by them has been removed or access to it disabled. For that purpose, the hosting service provider should make available to the content provider meaningful information enabling the content provider to contest the decision. However, this does not necessarily require a notification to the content provider. Depending on the circumstances, hosting service providers may replace content which is considered terrorist content, with a message that it has been removed or disabled in accordance with this Regulation. Further information about the reasons as well as possibilities for the content provider to contest the decision should always be given without the need for a request.
2019/02/25
Committee: LIBE
Amendment 204 #
Proposal for a regulation
Recital 27
(27) In order to avoid duplication and possible interferences with investigations, the competent authorities should inform, coordinate and cooperate with each other and where appropriate with Europol when issuing removal orders or sending referrals to hosting service providers. In implementing the provisions of this Regulation, Europol could provide support in line with its current mandate and existing legal framework.
2019/02/25
Committee: LIBE
Amendment 205 #
Proposal for a regulation
Recital 28
(28) In order to ensure the effective and sufficiently coherent implementation of proactive measures, competent authorities in Member States should liaise with each other with regard to the discussions they have with hosting service providers as to the identification, implementation and assessment of specific proactive measures. Competent authorities in Member States should liaise with each other with a view to issuing a removal order in their respective jurisdiction. Such cooperation is also needed in relation to the adoption of rules on penalties, as well as the implementation and the enforcement of penalties.
2019/02/25
Committee: LIBE
Amendment 209 #
Proposal for a regulation
Recital 29
(29) It is essential that the competent authority within the Member State responsible for imposing penalties is fully informed about the issuing of removal orders and referrals and subsequent exchanges between the hosting service provider and the relevant competent authority. For that purpose, Member States should ensure appropriate communication channels and mechanisms allowing the sharing of relevant information in a timely manner.
2019/02/25
Committee: LIBE
Amendment 217 #
Proposal for a regulation
Recital 32
(32) The competent authorities in the Member States should be allowed to use such information to take investigatory measures available under Member State or Union law, including issuing a European Production Order under Regulation on European Production and Preservation Orders for electronic evidence in criminal matters 14. _________________ 14 COM(2018)225 final.
deleted
2019/02/25
Committee: LIBE
Amendment 223 #
Proposal for a regulation
Recital 33
(33) Both hosting service providers and Member States should establish points of contact to facilitate the swift handling of removal orders and referrals. In contrast to the legal representative, the point of contact serves operational purposes. The hosting service provider’s point of contact should consist of any dedicated means allowing for the electronic submission of removal orders and referrals and of technical and personal means allowing for the swift processing thereof. The point of contact for the hosting service provider does not have to be located in the Union and the hosting service provider is free to nominate an existing point of contact, provided that this point of contact is able to fulfil the functions provided for in this Regulation. With a view to ensure that terrorist content is removed or access to it is disabled without undue delay from the receipt of a removal order, hosting service providers should ensure that the point of contact is reachable 24/7. The information on the point of contact should include information about the language in which the point of contact can be addressed. In order to facilitate the communication between the hosting service providers and the competent authorities, hosting service providers are encouraged to allow for communication in one of the official languages of the Union in which their terms and conditions are available.
2019/02/25
Committee: LIBE
Amendment 224 #
Proposal for a regulation
Recital 34
(34) In the absence of a general requirement for service providers to ensure a physical presence within the territory of the Union, there is a need to ensure clarity under which Member State's jurisdiction the hosting service provider offering services within the Union falls. As a general rule, the hosting service provider falls under the jurisdiction of the Member State in which it has its main establishment or in which it has designated a legal representative. Nevertheless, where another Member State issues a removal order, its authorities should be able to enforce their orders by taking coercive measures of a non-punitive nature, such as penalty payments. With regards to a hosting service provider which has no establishment in the Union and does not designate a legal representative, any Member State should, nevertheless, be able to issue penalties, provided that the principle of ne bis in idem is respected.
2019/02/25
Committee: LIBE
Amendment 231 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, Member States should designate independent and impartial public competent authorities. The requirement to designate competent authorities does not necessarily require the establishment of new authorities but can be existing bodies tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders, referrals and for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks.
2019/02/25
Committee: LIBE
Amendment 235 #
Proposal for a regulation
Recital 38
(38) Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour from receipt of a removal order. Non- compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
2019/02/25
Committee: LIBE
Amendment 245 #
Proposal for a regulation
Recital 38
(38) Penalties can be necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour from receipt of a removal order. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
2019/02/25
Committee: LIBE
Amendment 257 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to limit the dissemination of terrorist content online. It lays down in particular:
2019/02/25
Committee: LIBE
Amendment 260 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) rules on duties of care to be applied by hosting service providers in order to prevent the dissemination of terrorist content through their services and ensure, where necessary, its swift removal;
deleted
2019/02/25
Committee: LIBE
Amendment 267 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify terrorist content, to enable its swift removal by hosting service providers in accordance with Union law providing suitable safeguards for fundamental rights and to facilitate cooperation with the competent authorities in other Member States, hosting service providers and where appropriate relevant Union bodies.
2019/02/25
Committee: LIBE
Amendment 278 #
Proposal for a regulation
Article 1 – paragraph 2 a (new)
2 a. This Regulation is without prejudice to Articles 14 and 15 of Directive 2000/31/EC.
2019/02/25
Committee: LIBE
Amendment 287 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services directed at end-users consisting in the storage of information provided by and at the request of the content provider and in making the information stored available to the public. This excludes the provision of Internet infrastructure services at layers other than the application layer, as well as electronic communication services in the meaning of Directive (EU) 2018/1972;
2019/02/25
Committee: LIBE
Amendment 306 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – introductory part
(5) 'terrorist content' means any material that manifestly incites the commission of terrorist offences as listed in Article 3(1) of Directive (EU) 2017/541 in accordance with the definition provided for by national law, thereby causing a clear and substantial danger that such acts be committed. Materials disseminated for educational, journalistic and research purposes, as well as the expression of radical, polemic or controversial views in the public debate on sensitive political questions, should not be considered as terrorist content;
2019/02/25
Committee: LIBE
Amendment 314 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point a
(a) inciting or advocating, including by glorifying, the commission of terrorist offences, thereby causing a danger that such acts be committed;
deleted
2019/02/25
Committee: LIBE
Amendment 322 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point b
(b) encouraging the contribution to terrorist offences;
deleted
2019/02/25
Committee: LIBE
Amendment 328 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541;
deleted
2019/02/25
Committee: LIBE
Amendment 335 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point d
(d) instructing on methods or techniques for the purpose of committing terrorist offences.
deleted
2019/02/25
Committee: LIBE
Amendment 351 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means making terrorist content publicly available to third parties on the hosting service providers’ services;
2019/02/25
Committee: LIBE
Amendment 357 #
Proposal for a regulation
Article 2 – paragraph 1 – point 8
(8) 'referral' means a notice by a competent authority or, where applicable, a relevant Union body to a hosting service provider about information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions aimed to prevent dissemination of terrorism content;
deleted
2019/02/25
Committee: LIBE
Amendment 362 #
Proposal for a regulation
Article 2 – paragraph 1 – point 9 a (new)
(9a) ‘competent authority’ means an independent and impartial public authority under national law.
2019/02/25
Committee: LIBE
Amendment 364 #
Proposal for a regulation
Article 3 – title
General principles
2019/02/25
Committee: LIBE
Amendment 372 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, provisions to prevent the dissemination of terrorist content.
deleted
2019/02/25
Committee: LIBE
Amendment 386 #
Proposal for a regulation
Article 4 – paragraph 1
1. The competent authority shall have the power to issue an order requiring the hosting service provider to disable access to terrorist content.
2019/02/25
Committee: LIBE
Amendment 388 #
Proposal for a regulation
Article 4 – paragraph 1 a (new)
1a. Where material is published under the editorial responsibility of a content provider, any removal order can only become effective based on a judicial order.
2019/02/25
Committee: LIBE
Amendment 396 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall disable access to terrorist content without undue delay from receipt of the removal order.
2019/02/25
Committee: LIBE
Amendment 400 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a detailed statement of reasons explaining why the content is considered terrorist content, at least, by reference to the categories of terrorist content listed in Article 2(5);
2019/02/25
Committee: LIBE
Amendment 412 #
Proposal for a regulation
Article 4 – paragraph 3 – point g
(g) where relevant, the decision not to disclose information about the disabling of access to terrorist content referred to in Article 11.
2019/02/25
Committee: LIBE
Amendment 416 #
Proposal for a regulation
Article 4 – paragraph 4
4. Upon request by the hosting service provider or by the content provider, the competent authority shall provide a detailed statement of reasons, without prejudice to the obligation of the hosting service provider to comply with the removal order within the deadline set out in paragraph 2.
deleted
2019/02/25
Committee: LIBE
Amendment 426 #
Proposal for a regulation
Article 4 – paragraph 6
6. Hosting service providers shall acknowledge receipt and, without undue delay, inform the competent authority about the disabling of access to terrorist content, indicating the time of action, using the template set out in Annex II.
2019/02/25
Committee: LIBE
Amendment 429 #
Proposal for a regulation
Article 4 – paragraph 7
7. If the hosting service provider cannot comply with the removal order because of force majeure or of de facto impossibility not attributable to the hosting service provider, or for technical or operational reasons, it shall inform, without undue delay, the competent authority, explaining the reasons, using the template set out in Annex III. As soon as the reasons invoked are no longer present, the hosting service provider shall comply with the order without undue delay.
2019/02/25
Committee: LIBE
Amendment 432 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider cannot comply with the removal order because the removal order contains manifest errors or does not contain sufficient information to execute the order, it shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III.
2019/02/25
Committee: LIBE
Amendment 437 #
Proposal for a regulation
Article 4 – paragraph 9
9. The competent authority which issued the removal order shall inform the competent authority which oversees the implementation of proactive measures, referred to in Article 17(1)(c) when the removal order becomes final. A removal order becomes final where it has not been appealed within the deadline according to the applicable national law or where it has been confirmed following an appeal.
2019/02/25
Committee: LIBE
Amendment 447 #
Proposal for a regulation
Article 5
Referrals
1. The competent authority or the relevant Union body may send a referral to a hosting service provider.
2. Hosting service providers shall put in place operational and technical measures facilitating the expeditious assessment of content that has been sent by competent authorities and, where applicable, relevant Union bodies for their voluntary consideration.
3. The referral shall be addressed to the main establishment of the hosting service provider or to the legal representative designated by the service provider pursuant to Article 16 and transmitted to the point of contact referred to in Article 14(1). Such referrals shall be sent by electronic means.
4. The referral shall contain sufficiently detailed information, including the reasons why the content is considered terrorist content, a URL and, where necessary, additional information enabling the identification of the terrorist content referred.
5. The hosting service provider shall, as a matter of priority, assess the content identified in the referral against its own terms and conditions and decide whether to remove that content or to disable access to it.
6. The hosting service provider shall expeditiously inform the competent authority or relevant Union body of the outcome of the assessment and the timing of any action taken as a result of the referral.
7. Where the hosting service provider considers that the referral does not contain sufficient information to assess the referred content, it shall inform without delay the competent authorities or relevant Union body, setting out what further information or clarification is required.
Article 5 deleted
2019/02/25
Committee: LIBE
Amendment 467 #
Proposal for a regulation
Article 6 – paragraph 1
1. Where hosting service providers take proactive measures to limit the dissemination of terrorist content, they shall ensure that such measures are targeted, effective and proportionate, taking into account the risk and level of exposure to terrorist content, the fundamental rights of the users, and the fundamental importance of the right to freedom of expression and information in an open and democratic society.
2019/02/25
Committee: LIBE
Amendment 468 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
1a. Before introducing such measure, hosting service providers shall perform and make public a risk assessment on the level of exposure to terrorism content that is inter alia based on the number of removal orders and referrals received. Hosting service providers shall draw up a remedial action plan to tackle terrorist content proportionate to the level of risk identified.
2019/02/25
Committee: LIBE
Amendment 472 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – introductory part
The hosting service provider shall submit a report to the competent authority referred to in Article 17(1)(c), at least on an annual basis, on the specific proactive measures it has taken, including by using automated tools, with a view to:
2019/02/25
Committee: LIBE
Amendment 484 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point a
(a) preventing the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;
2019/02/25
Committee: LIBE
Amendment 489 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point b
(b) detecting, identifying and expeditiously removing or disabling access to terrorist content.
2019/02/25
Committee: LIBE
Amendment 491 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 2
Such a request shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider.
deleted
2019/02/25
Committee: LIBE
Amendment 500 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactive measures taken and reported under paragraph 2 are insufficient in mitigating and managing the risk and level of exposure, it may request the hosting service provider to take specific additional proactive measures. The hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to assessing the effectiveness and proportionality of the specific measures that the hosting service provider puts in place, establishing key objectives and benchmarks as well as timelines for their implementation.
2019/02/25
Committee: LIBE
Amendment 509 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within the three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall take into account, in particular, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
deleted
2019/02/25
Committee: LIBE
Amendment 516 #
Proposal for a regulation
Article 6 – paragraph 5
5. A hosting service provider may, at any time, request the competent authority referred to in Article 17(1)(c) a review and, where appropriate, to revoke a request or decision pursuant to paragraphs 2, 3, and 4 respectively. The competent authority shall provide a reasoned decision within a reasonable period of time after receiving the request by the hosting service provider.
deleted
2019/02/25
Committee: LIBE
Amendment 521 #
Proposal for a regulation
Article 7 – paragraph 1 – introductory part
1. Hosting service providers shall preserve terrorist content which has been removed or disabled as a result of a removal order, a referral or as a result of proactive measures pursuant to Articles 4, 5 and 6 and related data removed as a consequence of the removal of the terrorist content and which is necessary for:
2019/02/25
Committee: LIBE
Amendment 528 #
Proposal for a regulation
Article 7 – paragraph 1 – point a a (new)
(aa) complaint mechanisms pursuant to Article 10;
2019/02/25
Committee: LIBE
Amendment 530 #
Proposal for a regulation
Article 7 – paragraph 1 – point b
(b) the prevention, detection, investigation and prosecution of terrorist offences.
deleted
2019/02/25
Committee: LIBE
Amendment 547 #
Proposal for a regulation
Article 8 – paragraph 1
1. Where hosting service providers adopt a policy to limit the dissemination of terrorist content, they shall set out in their terms and conditions a meaningful explanation of the functioning of proactive measures including the use of automated tools, and clear information on how to access complaint procedures and to seek judicial redress.
2019/02/25
Committee: LIBE
Amendment 556 #
Proposal for a regulation
Article 8 – paragraph 3 – point b
(b) where applicable, information about the hosting service provider’s proactive measures, including automated measures;
2019/02/25
Committee: LIBE
Amendment 562 #
Proposal for a regulation
Article 8 – paragraph 3 – point c
(c) number of pieces of terrorist content removed or to which access has been disabled, following removal orders, referrals, or proactive measures, respectively;
2019/02/25
Committee: LIBE
Amendment 563 #
Proposal for a regulation
Article 8 – paragraph 3 – point c a (new)
(ca) number of removal orders received and follow-up actions taken pursuant to Article 4(6), (7), and (8), respectively;
2019/02/25
Committee: LIBE
Amendment 568 #
Proposal for a regulation
Article 8 – paragraph 3 – point d
(d) number and outcome of complaint procedures.
2019/02/25
Committee: LIBE
Amendment 569 #
Proposal for a regulation
Article 8 – paragraph 3 – point d a (new)
(da) number and outcome of actions for judicial redress.
2019/02/25
Committee: LIBE
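Taken together, the amendments to Article 8(3) above describe a transparency report as a small set of counters and outcome lists. The following is a minimal, purely illustrative sketch of such a record; the class and field names are the author's assumptions and do not appear in the amendments.

from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    """Illustrative structure mirroring the reporting items in Article 8(3) as amended."""
    proactive_measures: str = ""                # (b) information about proactive measures, incl. automated measures
    items_disabled: int = 0                     # (c) pieces of content removed or to which access has been disabled
    removal_orders_received: int = 0            # (ca) removal orders received
    followups_art4: dict = field(default_factory=dict)      # (ca) follow-up actions under Article 4(6), (7), (8)
    complaints: list = field(default_factory=list)          # (d) number and outcome of complaint procedures
    judicial_redress: list = field(default_factory=list)    # (da) number and outcome of actions for judicial redress

report = TransparencyReport(proactive_measures="hash-matching with human review",
                            items_disabled=12, removal_orders_received=15)
report.complaints.append({"outcome": "content reinstated"})
print(report)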
Amendment 578 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated tools pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable content considered to be terrorist content, are accurate and well-founded, in particular taking into account the relevant context.
2019/02/25
Committee: LIBE
Amendment 581 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verifications where appropriate and, in any event, where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content.
2019/02/25
Committee: LIBE
Amendment 590 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of proactive measures to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/25
Committee: LIBE
Amendment 595 #
Proposal for a regulation
Article 10 – paragraph 2
2. Upon receipt of a complaint, hosting service providers shall immediately reinstate the content. Where, following a thorough examination of the complaint, they consider that the removal or disabling of access was justified, they shall take a final decision confirming the removal or disabling of access. In that case, they shall inform the complainant about the outcome of the examination and the possibilities to seek judicial redress.
2019/02/25
Committee: LIBE
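The complaint workflow proposed in the amendment to Article 10(2) above (reinstate on receipt of a complaint, then either confirm the removal after a thorough examination or leave the content up, and inform the complainant either way) can be read as a small state machine. The sketch below is illustrative only; the function and parameter names are assumptions.

def handle_complaint(content_id: str, removal_justified_after_examination: bool) -> dict:
    """Illustrative flow for Article 10(2) as amended: reinstate first, examine, then decide."""
    state = {"content_id": content_id, "visible": True}  # reinstated immediately on complaint
    if removal_justified_after_examination:
        state["visible"] = False          # final decision confirming the removal
        state["decision"] = "removal confirmed"
    else:
        state["decision"] = "content stays reinstated"
    state["complainant_informed"] = True  # outcome and judicial redress options communicated
    return state

print(handle_complaint("c42", removal_justified_after_examination=False))
print(handle_complaint("c43", removal_justified_after_examination=True))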
Amendment 606 #
Proposal for a regulation
Article 11 – paragraph 2
2. The hosting service provider shall also inform the content provider about the reasons for the removal or disabling of access and possibilities to contest the decision.
2019/02/25
Committee: LIBE
Amendment 609 #
Proposal for a regulation
Article 11 – paragraph 3
3. The obligation pursuant to paragraphs 1 and 2 shall not apply where the competent authority decides that there should be no disclosure for reasons of public security, in order to avoid prejudicing the prevention, investigation, detection and prosecution of terrorist offences, for as long as necessary, but not exceeding [four] weeks from that decision. In such a case, the hosting service provider shall not disclose any information on the removal or disabling of access to terrorist content.
2019/02/25
Committee: LIBE
Amendment 610 #
Proposal for a regulation
Article 11 a (new)
Article 11a
Right to a judicial remedy
1. Member States shall provide for the possibility of seeking effective remedy against any decision taken in accordance to Article 4 and Article 10(2). This shall consist, in particular, of the possibility for hosting service providers as well as content providers to appeal such decisions before a court or a judicial authority other than the competent authority.
2. Where the hosting service provider or the content provider launch an appeal against a removal order or a proactive measure, the appealed order or decision shall be suspended until the final decision taken by the judicial authority.
2019/02/25
Committee: LIBE
Amendment 618 #
Proposal for a regulation
Article 13 – paragraph 1
1. Competent authorities in Member States shall inform, coordinate and cooperate with each other and, where appropriate, with relevant Union bodies such as Europol with regard to removal orders and referrals to avoid duplication, enhance coordination and avoid interference with investigations in different Member States.
2019/02/25
Committee: LIBE
Amendment 628 #
Proposal for a regulation
Article 13 – paragraph 3 – point b
(b) the processing and feedback relating to referrals pursuant to Article 5;
deleted
2019/02/25
Committee: LIBE
Amendment 632 #
Proposal for a regulation
Article 13 – paragraph 3 – point c
(c) co-operation with a view to identify and implement proactive measures pursuant to Article 6.
deleted
2019/02/25
Committee: LIBE
Amendment 634 #
Proposal for a regulation
Article 13 – paragraph 4
4. Where hosting service providers become aware of any terrorist content on their services, they shall promptly inform authorities competent for the investigation and prosecution in criminal offences in the concerned Member State or the point of contact in the Member State pursuant to Article 14(2), where they have their main establishment or a legal representative. Hosting service providers may, in case of doubt, transmit this information to Europol for appropriate follow up.
2019/02/25
Committee: LIBE
Amendment 640 #
Proposal for a regulation
Article 14 – paragraph 1
1. Hosting service providers shall establish a point of contact allowing for the receipt of removal orders and referrals by electronic means and ensure their swift processing pursuant to Articles 4 and 5. They shall ensure that this information is made publicly available.
2019/02/25
Committee: LIBE
Amendment 641 #
Proposal for a regulation
Article 14 – paragraph 2
2. The information referred to in paragraph 1 shall specify the official language or languages of the Union, as referred to in Regulation 1/58, in which the contact point can be addressed and in which further exchanges in relation to removal orders and referrals pursuant to Articles 4 and 5 shall take place. This shall include at least one of the official languages of the Member State in which the hosting service provider has its main establishment or where its legal representative pursuant to Article 16 resides or is established.
2019/02/25
Committee: LIBE
Amendment 646 #
Proposal for a regulation
Article 14 – paragraph 3
3. Member States shall establish a point of contact to handle requests for clarification and feedback in relation to removal orders and referrals issued by them. Information about the contact point shall be made publicly available.
2019/02/25
Committee: LIBE
Amendment 648 #
Proposal for a regulation
Article 15 – paragraph 1
1. The Member State in which the main establishment of the hosting service provider is located shall have the jurisdiction for the purposes of Articles 4, 11a, 18, and 21. A hosting service provider which does not have its main establishment within one of the Member States shall be deemed to be under the jurisdiction of the Member State where the legal representative referred to in Article 16 resides or is established.
2019/02/25
Committee: LIBE
Amendment 655 #
Proposal for a regulation
Article 15 – paragraph 3
3. Where an authority of another Member State has issued a removal order according to Article 4(1), that Member State has jurisdiction to take coercive measures according to its national law in order to enforce the removal order.
deleted
2019/02/25
Committee: LIBE
Amendment 659 #
Proposal for a regulation
Article 16 – paragraph 1
1. A hosting service provider which does not have an establishment in the Union but offers services in the Union, shall designate, in writing, a legal or natural person as its legal representative in the Union for the receipt of, compliance with and enforcement of removal orders, referrals, requests and decisions issued by the competent authorities on the basis of this Regulation. The legal representative shall reside or be established in one of the Member States where the hosting service provider offers the services.
2019/02/25
Committee: LIBE
Amendment 662 #
Proposal for a regulation
Article 16 – paragraph 2
2. The hosting service provider shall entrust the legal representative with the receipt, compliance and enforcement of the removal orders, referrals, requests and decisions referred to in paragraph 1 on behalf of the hosting service provider concerned. Hosting service providers shall provide their legal representative with the necessary powers and resource to cooperate with the competent authorities and comply with these decisions and orders.
2019/02/25
Committee: LIBE
Amendment 663 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Each Member State shall designate the independent and impartial public authority or authorities competent to
2019/02/25
Committee: LIBE
Amendment 672 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) detect, identify and refer terrorist content to hosting service providers pursuant to Article 5;
deleted
2019/02/25
Committee: LIBE
Amendment 684 #
Proposal for a regulation
Article 18 – paragraph 1 – point a
(a) Article 3(2) (hosting service providers’ terms and conditions);
deleted
2019/02/25
Committee: LIBE
Amendment 687 #
Proposal for a regulation
Article 18 – paragraph 1 – point c
(c) Article 5(5) and (6) (assessment of and feedback on referrals);
deleted
2019/02/25
Committee: LIBE
Amendment 693 #
Proposal for a regulation
Article 18 – paragraph 1 – point d
(d) Article 6(2) and (3) (reports on proactive measures and the adoption of measures following a decision imposing specific proactive measures);
2019/02/25
Committee: LIBE
Amendment 694 #
Proposal for a regulation
Article 18 – paragraph 1 – point e
(e) Article 7 (preservation of data);
deleted
2019/02/25
Committee: LIBE
Amendment 696 #
Proposal for a regulation
Article 18 – paragraph 1 – point j
(j) Article 13(4) (information on evidence of terrorist offences);
deleted
2019/02/25
Committee: LIBE
Amendment 700 #
Proposal for a regulation
Article 18 – paragraph 3 – point a
(a) the nature, gravity, and duration of the failure to comply with the obligations;
2019/02/25
Committee: LIBE
Amendment 701 #
Proposal for a regulation
Article 18 – paragraph 3 – point b
(b) the intentional or negligent character of the failure to comply with the obligations;
2019/02/25
Committee: LIBE
Amendment 702 #
Proposal for a regulation
Article 18 – paragraph 3 – point c
(c) previous failures to comply with the obligations by the legal person held responsible;
2019/02/25
Committee: LIBE
Amendment 712 #
Proposal for a regulation
Article 18 – paragraph 4
4. Member States shall ensure that a systematic failure to comply with obligations pursuant to Article 4(2) is subject to financial penalties of up to 4% of the hosting service provider’s global turnover of the last business year.
deleted
2019/02/25
Committee: LIBE
Amendment 719 #
Proposal for a regulation
Article 21 – paragraph 1 – point a
(a) information about the number of removal orders and referrals issued, the number of pieces of terrorist content which has been removed or access to it disabled, including the corresponding timeframes pursuant to Articles 4 and 5;
2019/02/25
Committee: LIBE
Amendment 721 #
Proposal for a regulation
Article 21 – paragraph 1 – point b
(b) information about the specific proactive measures referred to in Article 6, including the amount of terrorist content which has been removed or access to it disabled and the corresponding timeframes;
2019/02/25
Committee: LIBE
Amendment 724 #
Proposal for a regulation
Article 21 – paragraph 1 – point d
(d) information about the number of redress procedures initiated pursuant to Article 11a and decisions taken by the competent authority in accordance with national law.
2019/02/25
Committee: LIBE