79 Amendments of Dominique BILDE related to 2018/0331(COD)
Amendment 58 #
Proposal for a regulation
Recital 2
(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity, in particular to plan attacks or organise the illegal sale of weapons.
Amendment 62 #
Proposal for a regulation
Recital 3
(3) The presence of terrorist content online has serious negative consequences for users, for citizens, the State and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from being used in any way by terrorists or for terrorist purposes and to help tackle terrorist content disseminated through their services.
Amendment 72 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded, but also taking into account the gravity of the threat posed by terrorism, which endangers the lives and physical integrity of European citizens, and the fact that a swift response to terrorism is essential in order to effectively prevent attacks. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information which is not of a terrorist nature, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
Amendment 80 #
Proposal for a regulation
Recital 8
(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation, which can adversely affect the rights of that person. The right includes, in particular, the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order and with due respect for the conditions and legal deadlines stipulated by the courts of that State.
Amendment 84 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences in Article 3 of Directive (EU) 2017/541 of the European Parliament and of the Council9. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages, advocates, praises or justifies the commission of or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group, or glorifies, praises or justifies such terrorist groups and/or the ideology claimed by a group identified as terrorist, i.e. in particular radical Islamist ideologies calling for jihad. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account the purpose of the message, which can be deduced from its nature and wording and the context in which it was issued. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes a key element in this assessment, which should in principle be sufficient to characterise this message as terrorist content. Content disseminated for educational, journalistic or research purposes should be adequately protected and differs from a terrorist message by its purpose. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content in so far as it falls within the scope of the law and does not constitute a terrorist message within the meaning of this Regulation. _________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
Amendment 95 #
Proposal for a regulation
Recital 11
(11) A substantial connection to the Union should be relevant to determine the scope of this Regulation. Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering goods or services. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from providing local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities towards one or more Member States as set out in Article 17(1)(c) of Regulation 1215/2012 of the European Parliament and of the Council10. On the other hand, provision of the service in view of mere compliance with the prohibition to discriminate laid down in Regulation (EU) 2018/302 of the European Parliament and of the Council11 cannot, on that ground alone, be considered as directing or targeting activities towards a given territory within the Union. _________________ 10 Regulation (EU) 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1). 11 Regulation (EU) 2018/302 of the European Parliament and of the Council of 28 February 2018 on addressing unjustified geo-blocking and other forms of discrimination based on customers' nationality, place of residence or place of establishment within the internal market and amending Regulations (EC) No 2006/2004 and (EU) 2017/2394 and Directive 2009/22/EC (OJ L 60 I, 2.3.2018, p. 1).
Amendment 99 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation within the meaning of Article 15 of Directive 2000/31/EC. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information, but with the diligence and speed that are necessary in view of the gravity of the terrorist threat.
Amendment 106 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities and the relevant legal remedies allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, corresponding to a dissemination rate of 30% within one hour, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or, if it cannot be removed immediately, access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union, but wherever possible, removal must be the preferred option.
Amendment 112 #
Proposal for a regulation
Recital 14
(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider easily and definitely to establish authenticity, including the accuracy of the date and the time of sending and receipt of the order, such as by secured email and platforms or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12 . _________________ 12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).
Amendment 115 #
Proposal for a regulation
Recital 15
(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers take into account and assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/79413 remains unaffected. _________________ 13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
Amendment 117 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the possible effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place and report back exhaustively on this matter to the competent authorities of the Member States. This requirement differs, however, from a general monitoring obligation, as set out in Article 15 of Directive 2000/31/EC. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider may be considered as an indication of a low level of exposure to terrorist content.
Amendment 123 #
Proposal for a regulation
Recital 17
(17) When putting in place proactive measures, hosting service providers should target terrorist information as defined in the present Regulation; in doing so, they should not violate users’ right to freedom of expression and information, including the right to freely receive and impart information. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and speed and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means, such as algorithms, to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. Hosting service providers should report to the competent authorities designated by the Member States on the means deployed to ensure compliance with the above obligations, and in particular on any automated means of detecting terrorist content that may be used. In this respect, it should be recalled that hosting service providers may not deploy automated means of detection in order to mask, block or reduce the scope of legal content which is not terrorist in nature.
Amendment 131 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services for terrorist purposes, the competent authorities designated by the Member States should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, its economic capacity (in particular if an SME is concerned) and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
Amendment 134 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives regarding public security while taking into account the gravity of the terrorist threat and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
Amendment 139 #
Proposal for a regulation
Recital 21
(21) The obligation to preserve for a limited period the content for proceedings of administrative or judicial review, in particular procedures to obtain legal redress, is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Companies must therefore do everything necessary to inform the competent authorities of the Member States of all terrorist content that has been identified, blocked and removed, even if that content does not strictly fall within the scope of that article. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data concerning content suspected of constituting terrorist content within the meaning of this Regulation.
Amendment 140 #
Proposal for a regulation
Recital 22
(22) To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences, and to allow scope for seeking legal redress. However, this period may be prolonged for the period that is necessary in case the review proceedings are initiated but not finalised within the six-month period, upon request by the authority carrying out the review. This duration should be sufficient to allow law enforcement authorities to preserve the necessary evidence in relation to investigations, while ensuring the balance with the fundamental rights concerned.
Amendment 145 #
Proposal for a regulation
Recital 25
(25) Complaint and redress procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. Hosting service providers should therefore establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider. The requirement for the hosting service provider to reinstate the content where it has been removed in error does not affect the possibility of hosting service providers to enforce their own terms and conditions on other grounds. Procedures for obtaining legal redress are open to hosting service providers in accordance with the applicable national law.
Amendment 147 #
Proposal for a regulation
Recital 26
(26) Effective legal protection according to Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union requires that persons are able to ascertain the reasons upon which the content uploaded by them has been removed or access to it disabled. For that purpose, the hosting service provider should make available to the content provider meaningful information enabling the content provider to contest the decision. However, this does not necessarily require a notification to the content provider. Depending on the circumstances, hosting service providers may replace content which is considered terrorist content, with a message that it has been removed or disabled in accordance with this Regulation. Further information about the reasons as well as possibilities for the content provider to contest the decision and seek legal redress should be given upon request, subject to any statutory confidentiality requirement or any restriction necessary in the context of anti-terrorism measures. Where competent authorities decide that for reasons of public security including in the context of an investigation, it is considered inappropriate or counter-productive to directly notify the content provider of the removal or disabling of content, they should inform the hosting service provider.
Amendment 149 #
Proposal for a regulation
Recital 27
(27) In order to avoid duplication and possible interferences with investigations, the competent authorities should inform, coordinate and cooperate with each other and, where this is necessary and justified, with Europol when issuing removal orders or sending referrals to hosting service providers. In implementing the provisions of this Regulation, Europol could provide support in line with its current mandate and existing legal framework.
Amendment 152 #
Proposal for a regulation
Recital 28
(28) In order to ensure the effective and sufficiently coherent implementation of proactive measures, competent authorities in Member States may liaise with each other with regard to the discussions they have with hosting service providers as to the identification, implementation and assessment of specific proactive measures. Similarly, such cooperation may also be needed in relation to the adoption of rules on penalties, as well as the implementation and the enforcement of penalties.
Amendment 153 #
Proposal for a regulation
Recital 31
(31) Given the particularly serious consequences of certain terrorist content, hosting service providers should promptly inform the authorities in the Member State concerned and the competent authorities where they are established or have a legal representative, about the existence of any evidence of terrorist offences that they become aware of or any suspect item of content pointing to a possible terrorist offence. In order to ensure proportionality, this obligation is limited to terrorist offences as defined in Article 3(1) of Directive (EU) 2017/541. Although the obligation to inform does not imply an obligation on hosting service providers to actively seek any such evidence, service providers must take every necessary step to inform the competent authorities of any item of content suspected of being linked to a terrorist offence. The Member State concerned is the Member State which has jurisdiction over the investigation and prosecution of the terrorist offences pursuant to Directive (EU) 2017/541 based on the nationality of the offender or of the potential victim of the offence or the target location of the terrorist act. In case of doubt, hosting service providers may transmit the information to Europol which should follow up according to its mandate, including forwarding to the relevant national authorities.
Amendment 155 #
Proposal for a regulation
Recital 33
(33) Both hosting service providers and Member States should establish points of contact to facilitate the swift handling of removal orders and referrals. In contrast to the legal representative, the point of contact serves operational purposes. The hosting service provider’s point of contact should consist of any dedicated means allowing for the electronic submission of removal orders and referrals and of technical and personal means allowing for the swift processing thereof. Hosting service providers are free to nominate an existing point of contact, provided that this point of contact is able to fulfil all the functions as set out in this Regulation. With a view to ensure that terrorist content is removed or access to it is disabled within one hour from the receipt of a removal order, hosting service providers should ensure that the point of contact is genuinely and easily reachable 24/7 and can quickly provide an appropriate response to any request. The information on the point of contact should include information about the language in which the point of contact can be addressed. The point of contact should issue the acknowledgements of receipt or any other form of proof attesting to receipt of the electronic submission. In order to facilitate the communication between the hosting service providers and the competent authorities, hosting service providers are encouraged to allow for communication in one of the official languages of the Union in which their terms and conditions are available. Hosting service providers who are not established in the European Union should, if at all possible, have a point of contact which can deal with the requests referred to above in one of the main languages of communication of the European Union, in particular English, French or German. If at all possible, the point of contact should have an emergency number or emergency contact person.
Amendment 157 #
Proposal for a regulation
Recital 34
(34) In the absence of a general requirement for service providers to ensure a physical presence within the territory of the Union, there is a need to ensure clarity under which Member State's jurisdiction the hosting service provider offering services within the Union falls. As a general rule, the hosting service provider falls under the jurisdiction of the Member State in which it has its main establishment or its main centre of interest or in which it has designated a legal representative. Nevertheless, where another Member State issues a removal order, its authorities should be able to enforce their orders by taking coercive measures of a non-punitive nature, such as penalty payments. With regard to a hosting service provider which has no establishment in the Union and does not designate a legal representative, any Member State should, nevertheless, be able to issue penalties, provided that the principle of ne bis in idem is respected.
Amendment 164 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, Member States should designate competent authorities. The requirement to designate competent authorities does not necessarily require the establishment of new authorities but can be existing bodies tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders, referrals and for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks and to allocate tasks among those authorities.
Amendment 166 #
Proposal for a regulation
Recital 38
(38) Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour from receipt of a removal order. Penalties may also be imposed if the service provider has repeatedly and categorically refused to inform the competent authority designated by the Member State concerned of the proactive measures it has taken to combat terrorist content. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the Regulation should set out to what extent the relevant obligations can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider with a view to imposing proportionate penalties, in particular if the provider is an SME. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
Amendment 170 #
Proposal for a regulation
Recital 39
(39) The use of standardised templates may facilitate cooperation and the exchange of information between competent authorities and service providers, allowing them to communicate more quickly and effectively. It is particularly important to ensure swift action following the receipt of a removal order. Templates may reduce translation costs and contribute to a high quality standard. Response forms similarly should allow for a standardised exchange of information, and this will be particularly important where service providers are unable to comply. Authenticated submission channels can guarantee the authenticity of the removal order, including the accuracy of the date and the time of sending and receipt of the order.
Amendment 171 #
Proposal for a regulation
Recital 40
(40) In order to allow for a swift amendment, where necessary, of the content of the templates to be used for the purposes of this Regulation the power to adopt acts in accordance with Article 290 of the Treaty on the Functioning of the European Union should be delegated to the Commission for a limited period to amend Annexes I, II and III of this Regulation. In order to be able to take into account the development of technology and of the related legal framework, the Commission should also be empowered to adopt delegated acts to supplement this Regulation with technical requirements for the electronic means to be used by competent authorities for the transmission of removal orders. It is of particular importance that the Commission carries out appropriate consultations during its preparatory work, including at expert level, and that those consultations are conducted in accordance with the principles laid down in the Interinstitutional Agreement of 13 April 2016 on Better Law-Making15. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts. Parliament may also revoke the power to adopt delegated acts. _________________ 15 OJ L 123, 12.5.2016, p. 1.
Amendment 172 #
Proposal for a regulation
Recital 41
(41) Member States should collect information on the implementation of the legislation as far as is possible and as far as is necessary to evaluate this Regulation.
Amendment 173 #
Proposal for a regulation
Recital 42
(42) Based on the findings and conclusions in the implementation report and the outcome of the monitoring exercise, the Commission should carry out an initial evaluation of this Regulation at the latest three years after its entry into force and further regular evaluations thereafter. The evaluation should be based on the five criteria of efficiency, effectiveness, relevance, coherence and EU added value. It will assess the functioning of the different operational and technical measures foreseen under the Regulation, including the effectiveness of measures to enhance the detection, identification and removal of terrorist content, the effectiveness of safeguard mechanisms as well as the impacts on potentially affected fundamental rights, in particular freedom of expression, and interests of third parties, including a review of the requirement to inform content providers.
Amendment 175 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to prevent and penalise the misuse of hosting services for the dissemination of terrorist content online. It lays down in particular:
Amendment 178 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) rules on duties of care to be applied by hosting service providers in order to prevent the dissemination of terrorist content through their services and to ensure the expeditious removal of such content or, if removal is not possible or relevant in a particular case, the blocking of the content;
Amendment 182 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to prevent the dissemination of terrorist content, to enable its swift removal or, if removal is not possible or relevant, its blocking by hosting service providers, to contribute to the needs of criminal investigations into terrorist acts and, in a general sense, to contribute to the fight against terrorism and to facilitate cooperation with the competent authorities in other Member States, hosting service providers and, where appropriate, relevant Union bodies.
Amendment 199 #
Proposal for a regulation
Article 2 – paragraph 1 – point 3 – introductory part
(3) 'to offer services in the Union’ means: enabling legal or natural persons in one or more Member States to use the services of the hosting service provider which has a substantial connection to that Member State or Member States, such as
Amendment 200 #
Proposal for a regulation
Article 2 – paragraph 1 – point 3 – point b
(b) a significant number of users in one or more Member States;
Amendment 208 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point a
(a) inciting or advocating the commission of terrorist offences, including by glorifying them or by justifying participation in jihad or Islamist terrorism, thereby causing a danger that such acts be committed, and in particular advocating groups recognised as terrorist groups, especially Islamist terrorist groups identified and classified as such by a Member State and which call for or glorify jihad;
Amendment 214 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point b
(b) encouraging the contribution to terrorist offences or justifying participation in jihad and Islamist terrorism;
Amendment 221 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, especially jihad, in particular by encouraging the participation in or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541;
Amendment 226 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point d
(d) instructing on methods or techniques for the purpose of committing terrorist offences, or providing any other strategic information in connection with an organisation recognised as a terrorist one, especially information on departures abroad for the purpose of participating in jihad or undertaking training to this end;
Amendment 230 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 a (new)
(5a) Content which aims to provide information on terrorism or denounce terrorist acts shall not be considered as terrorist content, even if it might contain images or describe events linked to terrorist crimes for the purpose of informing the public or for the authors to express themselves openly and especially in political terms;
Amendment 233 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means uploading terrorist content and making it available to third parties on the hosting service providers’ services;
Amendment 237 #
Proposal for a regulation
Article 2 – paragraph 1 – point 8
(8) 'referral' means a notice by a competent authority appointed by a Member State or, where applicable, a relevant Union body to a hosting service provider about information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions, aimed at preventing the dissemination of terrorist content;
Amendment 239 #
Proposal for a regulation
Article 2 – paragraph 1 – point 9
(9) ‘main establishment’ means the head office or registered office, defined as the centre of main interest, in other words the place within which the principal financial functions and operational control are exercised.
Amendment 243 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers shall take appropriate, effective, reasonable and proportionate actions in accordance with this Regulation against the dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a swift, diligent, proportionate and non-discriminatory manner, with due regard to the fundamental rights of the users, and take into account the fundamental importance of the freedom of expression and information in an open and democratic society and the seriousness of the terrorist threat, which affects the lives and physical integrity of individuals and/or the essential infrastructure of a country.
Amendment 262 #
Proposal for a regulation
Article 4 – paragraph 3 – point a
(a) identification of the competent authority issuing the removal order and any means enabling authentication of the removal order by the competent authority;
Amendment 263 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) if possible, given the urgency of the order and the very short time for the service provider to take action, a brief statement of reasons explaining why the content is considered terrorist content, or a reference to the categories of terrorist content listed in Article 2(5);
Amendment 269 #
Proposal for a regulation
Article 4 – paragraph 3 – point f
(f) information about the information or legal redress available to the hosting service provider and to the content provider;
Amendment 276 #
Proposal for a regulation
Article 4 – paragraph 4
4. Upon request by the hosting service provider or by the content provider and subject to any legal confidentiality obligation in the context of the fight against terrorism, the competent authority shall provide a full statement of reasons, without prejudice to the obligation of the hosting service provider to comply with the removal order within the deadline set out in paragraph 2.
Amendment 278 #
Proposal for a regulation
Article 4 – paragraph 5
5. The competent authorities shall address removal orders to the main establishment of the hosting service provider or to the legal representative designated by the hosting service provider pursuant to Article 16 and transmit them to the point of contact referred to in Article 14(1). Such orders shall be sent by electronic means capable of producing a written record under conditions allowing the authenticity of the sender to be established, including the accuracy of the date and the time of sending and receipt of the order. To this end, the point of contact shall, where possible, provide information on such orders having been received and read, as well as an emergency contact number.
Amendment 281 #
Proposal for a regulation
Article 4 – paragraph 6
6. Hosting service providers shall acknowledge that such orders have been received and read and, as soon as possible, inform the competent authority about the removal of terrorist content or disabling access to it, where possible specifying why one of these solutions was preferred to the other and indicating, in particular, the time of action, using the template set out in Annex II.
Amendment 283 #
Proposal for a regulation
Article 4 – paragraph 7
7. If the hosting service provider cannot comply with the removal order because of force majeure or of de facto impossibility not attributable to the hosting service provider, as defined by the case law of the relevant jurisdiction of the Member State concerned, it shall inform the competent authority as soon as possible, explaining the reasons, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the reasons invoked are no longer present.
Amendment 287 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider cannot comply with the removal order because the removal order contains manifest errors or does not contain sufficient information to execute the order, it shall inform the competent authority by any means possible and as soon as possible, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the clarification is provided.
Amendment 300 #
Proposal for a regulation
Article 5 – paragraph 5
5. The hosting service provider shall, as a matter of priority, assess the content identified in the referral against its own terms and conditions and decide whether to remove that content or to disable access to it on the basis of objective criteria and with particular regard to the risk posed by the content.
Amendment 301 #
Proposal for a regulation
Article 5 – paragraph 6
6. The hosting service provider shall, as soon as possible, inform the competent authority or relevant Union body of the outcome of the assessment, the reason for the decision taken and the timing of any action taken as a result of the referral.
Amendment 302 #
Proposal for a regulation
Article 5 – paragraph 7
7. Where the hosting service provider considers that the referral does not contain sufficient information to assess the referred content, it shall inform the competent authorities or relevant Union body as soon as possible, setting out what further information or clarification is required.
Amendment 311 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactive measures taken and reported under paragraph 2 are insufficient or inadequate in mitigating and managing the risk and level of exposure, it may request the hosting service provider to take specific additional proactive measures, insofar as such measures are proportionate to the service provider's exposure to terrorist risk. For that purpose, the hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to identifying the specific measures that the hosting service provider shall put in place, establishing key objectives and benchmarks as well as timelines for their implementation. Similarly, the competent authority may alert the hosting service provider if the measures taken manifestly constitute an excessive infringement of freedom of expression or fundamental rights in relation to the aim pursued and the degree of exposure to terrorist content, in particular with regard to the means of automatic content detection.
Amendment 314 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within the three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall take into account, in particular, the economic capacity of the hosting service provider, especially if it is an SME, and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information, as well as the degree of exposure to terrorist content and the nature of such content. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
Amendment 318 #
Proposal for a regulation
Article 7 – paragraph 1 – point a
(a) proceedings of administrative or judicial review or, where appropriate, any judicial redress,
Amendment 319 #
Proposal for a regulation
Article 7 – paragraph 1 – point b
(b) the prevention, detection, investigation and prosecution of terrorist offences, and in general the fight against terrorism.
Amendment 326 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall set out in their terms and conditions their policy to prevent the dissemination of terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures, in particular in the case of the use of automated tools and algorithms, and the detailed functioning of such automated tools.
Amendment 335 #
Proposal for a regulation
Article 8 – paragraph 3 – point a
(a) information about the hosting service provider’s measures in relation to the detection, identification and removal - or, if removal is not possible or appropriate, disabling - of terrorist content, specifying, as far as possible, why one solution has been chosen ahead of the other;
Amendment 338 #
Proposal for a regulation
Article 8 – paragraph 3 – point b
(b) information about the hosting service provider’s measures to prevent the re-upload of content which has previously been removed or, where removal was not possible or appropriate, to which access has been disabled because it is considered to be terrorist content;
Amendment 340 #
Proposal for a regulation
Article 8 – paragraph 3 – point c
(c) number of pieces of terrorist content removed or, where removal was not possible or appropriate, to which access has been disabled, following removal orders, referrals, or proactive measures, respectively;
Amendment 347 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated tools pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or, where removal is not possible or appropriate, disable content considered to be terrorist content, are accurate and well-founded.
Amendment 354 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verifications where appropriate and, in any event, where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content.
Amendment 360 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish effective and easily accessible mechanisms capable of providing a response within reasonable time limits allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
Amendment 363 #
Proposal for a regulation
Article 10 – paragraph 2
2. Hosting service providers shall promptly examine every complaint that they receive and reinstate the content without undue delay where the removal or disabling of access was unjustified. If the examination finds that the removal or disabling was unjustified, they shall inform the complainant accordingly. Where the examination concludes that the removal or disabling was in fact justified, the service provider shall assess the need to explain the reasons to the applicant, in particular with regard to any statutory confidentiality requirement in connection with the fight against terrorism.
Amendment 364 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2a. The complaint mechanism referred to above shall be without prejudice to any judicial redress available to the applicant under the applicable national law.
Amendment 365 #
Proposal for a regulation
Article 11 – paragraph 1
1. Where hosting service providers remove terrorist content or, where removal is not possible or appropriate, disable access to it, they shall make available to the content provider information on the removal or disabling of access to terrorist content.
Amendment 368 #
Proposal for a regulation
Article 11 – paragraph 2
2. Upon request of the content provider, and subject to any statutory confidentiality requirement, the hosting service provider shall inform the content provider about the reasons for the removal or disabling of access and possibilities to contest the decision.
Amendment 373 #
Proposal for a regulation
Article 13 – paragraph 1
1. Where necessary, the competent authorities in Member States shall inform, coordinate and cooperate with each other and, where appropriate, with relevant Union bodies such as Europol with regard to removal orders and referrals to avoid duplication, enhance coordination and avoid interference with investigations in different Member States.
Amendment 377 #
Proposal for a regulation
Article 13 – paragraph 4
4. Where hosting service providers become aware of any evidence of terrorist offences or any suspicions of a possible link with terrorist offences, they shall promptly inform the authorities competent for the investigation and prosecution of criminal offences in the Member State concerned or the point of contact in the Member State pursuant to Article 14(2), where they have their main establishment or a legal representative. Hosting service providers may, in case of doubt, transmit this information to Europol for appropriate follow-up.
Amendment 380 #
Proposal for a regulation
Article 14 – paragraph 1
1. Hosting service providers shall establish a point of contact allowing for the receipt of removal orders and referrals by electronic means and ensure their swift processing pursuant to Articles 4 and 5. They shall ensure that this information is made publicly available. As far as possible, in particular with regard to their financial capacities, service providers shall provide an emergency number or a contact available for as many hours of the day as possible.
Amendment 382 #
Proposal for a regulation
Article 14 – paragraph 2
2. The information referred to in paragraph 1 shall specify the official language or languages of the Union, as referred to in Regulation 1/58, in which the contact point can be addressed and in which further exchanges in relation to removal orders and referrals pursuant to Articles 4 and 5 shall take place. This shall include at least one of the official languages of the Member State in which the hosting service provider has its main establishment or where its legal representative pursuant to Article 16 resides or is established and, where possible, one of the main languages of communication of the institutions of the European Union, namely English, German or French.
Amendment 385 #
Proposal for a regulation
Article 16 – paragraph 4
4. The hosting service provider shall notify the competent authority referred to in Article 17(1)(d) in the Member State where the legal representative resides or is established about the designation. Information about the legal representative shall be publicly available and regularly updated. Under the same conditions, the hosting service provider shall inform the competent authority of any change of legal representative as soon as possible. For its part, the hosting provider shall regularly update the contact point information and inform the competent authority designated by the competent Member State of any changes as soon as possible.
Amendment 390 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) oversee the definition and implementation of proactive measures pursuant to Article 6;
Amendment 395 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1a. Member States shall determine the allocation of the tasks referred to above between the competent authorities designated by them.
Amendment 403 #
Proposal for a regulation
Article 18 – paragraph 3 – point d
(d) the financial strength of the legal person held liable, in particular in the case of an SME;
Amendment 410 #
Proposal for a regulation
Article 20 – paragraph 1
1. The power to adopt delegated acts is conferred on the Commission for five years subject to the conditions laid down in this Article.
Amendment 412 #
Proposal for a regulation
Article 21 – paragraph 1 – introductory part
1. Where possible, Member States shall collect from their competent authorities and the hosting service providers under their jurisdiction and send to the Commission every year by [31 March] the most relevant information about the actions they have taken in accordance with this Regulation. That information shall include: