
149 Amendments of Eva JOLY related to 2018/0331(COD)

Amendment 39 #
Proposal for a regulation
Recital 1
(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by addressing the misuse of hosting services for terrorist purposes and contributing to the investigation of criminal offences. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users’ trust in the online environment, and by strengthening safeguards to ensure the rule of law and fundamental rights, in particular the freedom of expression and information, the right to freedom and pluralism of the media, the freedom to conduct a business and the rights to privacy and the protection of personal data.
2019/02/25
Committee: LIBE
Amendment 48 #
Proposal for a regulation
Recital 1 a (new)
(1a) Regulation of hosting service providers can only complement Member States’ strategies and actions to address illegal terrorist offences, which must emphasise offline measures, such as criminal investigations and cross-border cooperation, as well as preventive measures, including investment in education, social cohesion and violence prevention, among others. As many studies show, the process of radicalisation very rarely happens online only. Actual violent radicalisation entails several complex processes, including person-to-person communication in conjunction with other offline factors. The role that the internet and social media may play in this process should however not be underestimated.
2019/02/25
Committee: LIBE
Amendment 58 #
Proposal for a regulation
Recital 3
(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and in proportion to the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help competent authorities to address terrorist offences committed through their services.
2019/02/25
Committee: LIBE
Amendment 64 #
Proposal for a regulation
Recital 4
(4) Efforts at Union level to counter terrorist content online, which commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers, need to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online, adequately address an evolving problem, and put in place necessary safeguards to ensure the rule of law and the protection of fundamental rights. This legislative framework seeks to build on and address some shortcomings of voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347, and responds to calls made by the European Parliament to strengthen measures to tackle illegal content in line with the horizontal framework established by Directive 2000/31/EC and by the European Council to improve the automatic detection and removal of content. _________________ 7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).
2019/02/25
Committee: LIBE
Amendment 66 #
Proposal for a regulation
Recital 5
(5) This Regulation should lay down specific obligations of certain hosting service providers, and duties of care for those hosting service providers exposed to a significant amount of illegal terrorist content. The application of this Regulation should be without prejudice to Articles 14 and 15 of Directive 2000/31/EC8. In particular, the liability exemption granted to hosting service providers should not be affected by any measures they take in compliance with this Regulation, including any additional measures, on the condition that they do not have actual knowledge of illegal activity or information or, upon obtaining such knowledge, they remove or disable access to that content expeditiously. As Article 15 of Directive 2000/31/EC prohibits general monitoring obligations on the information which they store as well as general obligations to actively seek facts or circumstances indicating illegal activity, this Regulation should not lead to information transmitted by competent authorities to hosting providers that is vague about the status of legality of the content notified. Where the hosting provider is not informed by the competent authority whether the content notified is considered to be illegal, it might risk facing liability for failing to act expeditiously to remove the content. Therefore, this information needs to be provided in any case by the competent authority. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met.
_________________ 8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).
2019/02/25
Committee: LIBE
Amendment 74 #
Proposal for a regulation
Recital 6
(6) Rules to address the misuse of hosting services for the dissemination of terrorist content online in order to guarantee the smooth functioning of the internal market are set out in this Regulation in full respect of the rule of law and the fundamental rights protected in the Union’s legal order and notably those guaranteed in the Charter of Fundamental Rights of the European Union.
2019/02/25
Committee: LIBE
Amendment 77 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the rule of law and the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information and the rights to privacy and the protection of personal data, which constitute essential foundations of a pluralist, democratic society and are among the values on which the Union is founded. Measures taken to remove terrorist content online constitute an interference in the freedom of expression and information, and should therefore be strictly targeted, necessary, appropriate and proportionate to help the fight against terrorism, including the investigation and prosecution of terrorist offences, in the sense that they must serve to address the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/25
Committee: LIBE
Amendment 88 #
Proposal for a regulation
Recital 8
(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation which can adversely affect the rights of that person. The right includes, in particular, the possibility for hosting service providers and content providers to effectively contest removal orders before the court of the Member State whose authorities issued the removal order, and the possibility for content providers to contest the results of measures taken by the hosting provider.
2019/02/25
Committee: LIBE
Amendment 92 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities might take to address the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content that is in line with the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission of or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity, should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.
_________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/25
Committee: LIBE
Amendment 102 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services whose main purpose, or one of whose main purposes, is to offer the storage of information provided by a recipient of the service at his or her request and to make the information stored available to the public, and which have general control of and access to the content data stored and processed, irrespective of whether this activity is of a mere technical, automatic and passive nature, and irrespective of whether a payment of the user is required. By way of example, such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other online services, to the extent they make the information available to the public. The Regulation should also apply to hosting service providers established outside the Union but offering services to users who are within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services, and whether these services are specifically targeted at users in the Union. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.
By contrast, services which consist of providing a mere technical facility, such as ‘cloud services’ which consist in the provision of on-demand physical or virtual resources providing computing and storage infrastructure capabilities, on which the service provider has no contractual rights as to what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by its customers or the end-users of its customers, or services that consist of selling goods online, the delivery of goods as such, or the provision of services offline, or private websites, including blogs, should not be considered as hosting service providers within the scope of this Regulation. Mere conduits and other electronic communication services within the meaning of Directive xxx/2019 of the European Parliament and of the Council [European Electronic Communication Code], providers of caching services, and other services provided in other layers of the Internet infrastructure, such as registries and registrars, DNS (domain name system) or adjacent services, such as payment services or DDoS (distributed denial of service) protection services, should also not be understood as hosting service providers. The same is the case for interpersonal communication services that enable direct interpersonal and interactive exchange of information between a finite number of persons, whereby the persons initiating or participating in the communication determine its recipient(s).
2019/02/25
Committee: LIBE
Amendment 114 #
Proposal for a regulation
Recital 12
(12) Without prejudice to Article 15 of Directive 2000/31/EC, hosting service providers that have been exposed to a significant number of uncontested removal orders should apply certain duties of care, in order to address the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information and freedom and pluralism of the media.
2019/02/25
Committee: LIBE
Amendment 125 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Competent judicial authorities of the Member States should assess whether content is terrorist content, and whether to issue legal orders requesting hosting service providers to remove that content or disable access to it; Member States should designate judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled expeditiously after receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/25
Committee: LIBE
Amendment 131 #
Proposal for a regulation
Recital 14
(14) The competent authority should transmit the removal order directly to the addressee and point of contact by any electronic means capable of producing a written record under conditions that allow the service provider to establish authenticity, including the identity of the sender, the accuracy of the date and the time of sending and receipt of the order, such as by electronically signed email or other secured channels, including those made available by the service provider, in line with the rules protecting personal data. This requirement may notably be met by the use of qualified electronic registered delivery services with qualified electronic signatures as provided for by Regulation (EU) 910/2014 of the European Parliament and of the Council12. _________________ 12 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).
2019/02/25
Committee: LIBE
Amendment 136 #
Proposal for a regulation
Recital 15
(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, the mandate of Europol as laid down in Article 4(1)(m) of Regulation (EU) 2016/794 remains unaffected. _________________ 13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
2019/02/25
Committee: LIBE
Amendment 143 #
Proposal for a regulation
Recital 16
(16) The vast majority of hosting service providers are never exposed to illegal terrorist content. Given the complexity of effectively identifying and removing terrorist content at scale, and the potential impact on fundamental rights, duties of care that go beyond the mere removal of terrorist content online following removal orders from competent authorities could be taken by those hosting service providers that have been subject to a significant number of uncontested removal orders. Those hosting service providers should assess whether it is appropriate to take additional measures depending on the risks and level of exposure to terrorist content, as well as on the effects on the rights of third parties and the public interest of information. Consequently, those hosting service providers should only put appropriate, effective, necessary and proportionate additional measures in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals by Europol addressed to a hosting provider is an indication of a low level of exposure to terrorist content.
2019/02/25
Committee: LIBE
Amendment 149 #
Proposal for a regulation
Recital 17
(17) When putting in place additional measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - as well as the rights to privacy and protection of personal data, are preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any final decision to remove or disable access to content should always be taken by a natural person. Any decision to use automated means should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. In any case, hosting service providers should undertake a fundamental rights audit for any automated means for detecting terrorist content they use.
2019/02/25
Committee: LIBE
Amendment 155 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to address the misuse of their services, the competent authorities may recommend to hosting service providers having received a significant amount of removal orders, which have become final, to report on the additional measures to be taken. These could consist of measures that employ the use of reliable technical tools to detect and identify new terrorist content, but that leave the final decision to remove or disable access to a natural person. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, its economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
2019/02/25
Committee: LIBE
Amendment 165 #
Proposal for a regulation
Recital 19
(19) Following the recommendation, the competent authority should enter into a dialogue with the hosting service provider about the additional measures to be put in place. Such measures should not, in principle, lead to the imposition of a general obligation to monitor uploaded content, as provided in Article 15(1) of Directive 2000/31/EC. Before adopting such recommendations, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular the freedom of expression and information, the freedom of the media, the rights to privacy and to the protection of personal data, and the freedom to conduct a business, and provide appropriate justification.
2019/02/25
Committee: LIBE
Amendment 176 #
Proposal for a regulation
Recital 21
(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled, as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity, and for prosecuting and convicting terrorists. Where companies remove material or disable access to it, in particular through their own additional measures, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences, which should be initiated after the authorities have been informed pursuant to Article 13(4) of this Regulation, is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.
2019/02/25
Committee: LIBE
Amendment 180 #
Proposal for a regulation
Recital 22
(22) To ensure proportionality, the period of preservation should be limited to six months to allow the content providers sufficient time to initiate the review process and to enable law enforcement access to relevant data for the investigation and prosecution of terrorist offences. However, this period may be prolonged, upon request by the authority carrying out the review, for the period that is necessary in case the review proceedings are initiated but not finalised within the six-month period. This duration should be sufficient to allow law enforcement authorities to preserve the necessary evidence in relation to investigations, while ensuring the balance with the fundamental rights concerned. The preserved content and data should be erased after the end of this period.
2019/02/25
Committee: LIBE
Amendment 188 #
Proposal for a regulation
Recital 24
(24) Transparency of hosting service providers' policies in relation to terrorist content is essential to enhance their accountability towards their users and to reinforce trust of citizens in the Digital Single Market. Hosting service providers exposed to terrorist content should publish annual transparency reports containing meaningful information about action taken in relation to the detection, identification and removal of terrorist content.
2019/02/25
Committee: LIBE
Amendment 190 #
Proposal for a regulation
Recital 24 a (new)
(24 a) Content providers whose content has been removed following a removal order should have a right to an effective remedy in accordance with Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union.
2019/02/25
Committee: LIBE
Amendment 191 #
Proposal for a regulation
Recital 25
(25) Complaint procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. Hosting service providers should therefore establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider. The requirement for the hosting service provider to reinstate the content where it has been removed in error does not affect the possibility of hosting service providers to enforce their own terms and conditions on other grounds. Member States should also guarantee that hosting service providers and content providers can effectively exercise their right to judicial redress. Furthermore, content providers whose content has been removed following a removal order should have the right to an effective judicial remedy in accordance with Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Effective appeal mechanisms should be established at national level to ensure that any party subject to a removal order issued by a competent judicial authority has the right to appeal to a judicial body. The appeal procedure is without prejudice to the division of competences within national judicial systems.
2019/02/25
Committee: LIBE
Amendment 195 #
Proposal for a regulation
Recital 26
(26) Effective legal protection according to Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union requires that persons are able to ascertain the reasons upon which the content uploaded by them has been removed or access to it disabled. For that purpose, the hosting service provider should make available to the content provider meaningful information enabling the content provider to contest the decision. However, this does not necessarily require a notification to the content provider. Depending on the circumstances, hosting service providers may replace content which is considered terrorist content, with a message that it has been removed or disabled in accordance with this Regulation. Further information about the reasons as well as possibilities for the content provider to contest the decision should be given upon request. Where competent authorities decide that for reasons of public security including in the context of an investigation, it is considered inappropriate or counter-productive to directly notify the content provider of the removal or disabling of content, they should inform the hosting service provider.
2019/02/25
Committee: LIBE
Amendment 202 #
Proposal for a regulation
Recital 27
(27) In order to avoid duplication and possible interferences with investigations, the competent authorities should inform, coordinate and cooperate with each other and where appropriate with Europol when issuing removal orders or sending referrals to hosting service providers. In implementing the provisions of this Regulation, Europol could provide support in line with its current mandate and existing legal framework.
2019/02/25
Committee: LIBE
Amendment 207 #
Proposal for a regulation
Recital 28
(28) In order to ensure the effective and sufficiently coherent implementation of proactive measuremeasures by hosting service providers, competent authorities in Member States should liaise with each other with regard to the discussions they have with hosting service providers as to removal orders and the identification, implementation and assessment of specific proactiveadditional measures. Similarly, such cooperation is also needed in relation to the adoption of rules on penalties, as well as the implementation and the enforcement of penalties.
2019/02/25
Committee: LIBE
Amendment 212 #
Proposal for a regulation
Recital 29
(29) It is essential that the competent authority within the Member State responsible for imposing penalties is fully informed about the issuing of removal orders and referrals and subsequent exchanges between the hosting service provider and the relevant competent authority. For that purpose, Member States should ensure appropriate communication channels and mechanisms allowing the sharing of relevant information in a timely manner.
2019/02/25
Committee: LIBE
Amendment 214 #
Proposal for a regulation
Recital 30
(30) To facilitate the swift exchanges between competent authorities as well as with hosting service providers, and to avoid duplication of effort, Member States may make use of tools developed by Europol, such as the current Internet Referral Management application (IRMa) or successor tools or Eurojust.
2019/02/25
Committee: LIBE
Amendment 218 #
Proposal for a regulation
Recital 32
(32) The competent authorities in the Member States should be allowed to use such information to take investigatory measures available under Member State or Union law, including issuing a European Production Order under Regulation on European Production and Preservation Orders for electronic evidence in criminal matters14 . _________________ 14 COM(2018)225 final.
2019/02/25
Committee: LIBE
Amendment 221 #
Proposal for a regulation
Recital 33
(33) Both hosting service providers and Member States should establish points of contact to facilitate the swift handling of removal orders and referrals. In contrast to the legal representative, the point of contact serves operational purposes. The hosting service provider’s point of contact should consist of any dedicated means allowing for the electronic submission of removal orders and referrals and of technical and personal means allowing for the swift processing thereof. The point of contact for the hosting service provider does not have to be located in the Union and the hosting service provider is free to nominate an existing point of contact, provided that this point of contact is able to fulfil the functions provided for in this Regulation. With a view to ensuring that terrorist content is removed or access to it is disabled within one hour from the receipt of a removal order, hosting service providers should ensure that the point of contact is reachable 24/7. The information on the point of contact should include information about the language in which the point of contact can be addressed. In order to facilitate the communication between the hosting service providers and the competent authorities, hosting service providers are encouraged to allow for communication in one of the official languages of the Union in which their terms and conditions are available.
2019/02/25
Committee: LIBE
Amendment 227 #
Proposal for a regulation
Recital 34
(34) In the absence of a general requirement for service providers to ensure a physical presence within the territory of the Union, there is a need to ensure clarity under which Member State's jurisdiction the hosting service provider offering services within the Union falls. As a general rule, the hosting service provider falls under the jurisdiction of the Member State in which it has its main establishment or in which it has designated a legal representative. Nevertheless, where another Member State issues a removal order, its authorities should be able to enforce their orders by taking coercive measures of a non-punitive nature, such as penalty payments. With regards to a hosting service provider which has no establishment in the Union and does not designate a legal representative, any Member State should, nevertheless, be able to issue penalties, provided that the principle of ne bis in idem is respected.
2019/02/25
Committee: LIBE
Amendment 228 #
Proposal for a regulation
Recital 35
(35) Those hosting service providers which are not established in the Union, should designate in writing a legal representative in order to ensure the compliance with and enforcement of the obligations under this Regulation. Hosting service providers may make use of an existing legal representative, provided that this legal representative is able to fulfil the functions as set out in this Regulation.
2019/02/25
Committee: LIBE
Amendment 233 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, Member States should designate competent judicial authorities. The requirement to designate competent authorities does not necessarily require the establishment of new authorities but can be judicial existing bodies tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders, referrals and for overseeing proactiveadditional measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks.
2019/02/25
Committee: LIBE
Amendment 244 #
Proposal for a regulation
Recital 38
(38) Penalties are necessarycan contribute to ensureing the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severdissuasive penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour fromafter receipt of a removal order. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligaand ongoing failure, gainful interest, and other factors alleviating or aggravating the failure to remove terrorist content. Sanctions can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4)d possible penalties should not encourage the arbitrary removal by hosting service providers of content which is not terrorist content. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties. When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
2019/02/25
Committee: LIBE
Amendment 247 #
Proposal for a regulation
Recital 40
(40) In order to allow for a swift amendment, where necessary, of the content of the templates to be used for the purposes of this Regulation the power to adopt acts in accordance with Article 290 of the Treaty on the Functioning of the European Union should be delegated to the Commission to amend Annexes I, II and III of this Regulation. In order to be able to take into account the development of technology and of the related legal framework, the Commission should also be empowered to adopt delegated acts to supplement this Regulation with technical requirements for the electronic means to be used by competent authorities for the transmission of removal orders, and for determining what corresponds to a significant number of uncontested removal orders pursuant to this Regulation. It is of particular importance that the Commission carries out appropriate consultations during its preparatory work, including at expert level, and that those consultations are conducted in accordance with the principles laid down in the Interinstitutional Agreement of 13 April 2016 on Better Law-Making15 . In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts. _________________ 15 OJ L 123, 12.5.2016, p. 1.
2019/02/25
Committee: LIBE
Amendment 248 #
Proposal for a regulation
Recital 41
(41) Member States should collect information on the implementation of the legislation including information on the number of cases of successful detection, investigation and prosecution of terrorist offences as a consequence of this Regulation. A detailed programme for monitoring the outputs, results and impacts of this Regulation should be established in order to inform an evaluation of the legislation.
2019/02/25
Committee: LIBE
Amendment 249 #
Proposal for a regulation
Recital 42
(42) Based on the findings and conclusions in the implementation report and the outcome of the monitoring exercise, the Commission should carry out an evaluation of this Regulation no sooner than three years after its entry into force. The evaluation should be based on the five criteria of efficiency, effectiveness, relevance, coherence and EU added value. It will assess the functioning of the different operational and technical measures foreseen under the Regulation, including the effectiveness of measures to enhance the detection, identification and removal of terrorist content, the effectiveness of safeguard mechanisms as well as the impacts on potentially affected rights and interests of third parties, including a review of the requirement to inform content providersfundamental rights, including the freedom of expression and information, the right to freedom and pluralism of the media, the freedom to conduct a business and the rights to privacy and protection of personal data.
2019/02/25
Committee: LIBE
Amendment 252 #
Proposal for a regulation
Recital 43
(43) Since the objective of this Regulation, namely ensuring the smooth functioning of the digital single market by preventcontributing to the investigation of terrorist offences and addressing the dissemination of terrorist content online, cannot be sufficiently achieved by the Member States and can therefore, by reason of the scale and effects of the limitation, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective,
2019/02/25
Committee: LIBE
Amendment 258 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to preventaddress the misuse of hosting services for the dissemination of terrorist content online. It lays down in particular:
2019/02/25
Committee: LIBE
Amendment 265 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) rules on duties of care to be applied by hosting service providers in order to prevent the dissemination ofthat are particularly exposed to terrorist content through their services and, in order to ensure, where necessary, its swift removal;
2019/02/25
Committee: LIBE
Amendment 275 #
Proposal for a regulation
Article 1 – paragraph 2 a (new)
2 a. This Regulation does not apply to content which is disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity.
2019/02/25
Committee: LIBE
Amendment 282 #
Proposal for a regulation
Article 1 – paragraph 2 b (new)
2 b. This Regulation is without prejudice to the liability regime under Directive 2000/31/EC.
2019/02/25
Committee: LIBE
Amendment 288 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services consisting inwhose main or one of the main purposes is to offer the storage of information provided by and at the request of the content provider and in making the information stored available to third partiese public, and who has general control of and access to the content data stored and processed;
2019/02/25
Committee: LIBE
Amendment 295 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2
(2) 'content provider' means a user who has provided informationcontent data that is, or that has been, stored and made available to the public at the request of the user by a hosting service provider;
2019/02/25
Committee: LIBE
Amendment 298 #
Proposal for a regulation
Article 2 – paragraph 1 – point 3 – introductory part
(3) 'to offer services in the Union’ means: enabling legal or natural persons in one or more Member States to use the services of the hosting service provider, irrespective of whether a payment of the user is required or not, which has a substantial connection to that Member State or Member States, such as
2019/02/25
Committee: LIBE
Amendment 300 #
Proposal for a regulation
Article 2 – paragraph 1 – point 3 – point c
(c) targeting of activities towards users in one or more Member States.
2019/02/25
Committee: LIBE
Amendment 305 #
Proposal for a regulation
Article 2 – paragraph 1 – point 4
(4) 'terrorist offences' means offences as defined in Article 3(1) of Directive (EU) 2017/541;
2019/02/25
Committee: LIBE
Amendment 310 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – introductory part
(5) 'terrorist content' means one or more of the following information:manifestly illegal information qualifying as one or more of the offences defined in Articles 5 to 7 of Directive 2017/541 on combating terrorism.
2019/02/25
Committee: LIBE
Amendment 315 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point a
(a) inciting or advocating, including by glorifying, the commission of terrorist offences, thereby causing a danger that such acts be committed;deleted
2019/02/25
Committee: LIBE
Amendment 321 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point b
(b) encouraging the contribution to terrorist offences;deleted
2019/02/25
Committee: LIBE
Amendment 327 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541;deleted
2019/02/25
Committee: LIBE
Amendment 336 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point d
(d) instructing on methods or techniques for the purpose of committing terrorist offences.deleted
2019/02/25
Committee: LIBE
Amendment 348 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means making terrorist content available to third partiese public on the hosting service providers’ services;
2019/02/25
Committee: LIBE
Amendment 359 #
Proposal for a regulation
Article 2 – paragraph 1 – point 8
(8) 'referral' means a notice by a competent authority or, where applicable, a relevant Union body to a hosting service provider about information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions aimed to prevent dissemination of terrorism content;deleted
2019/02/25
Committee: LIBE
Amendment 361 #
Proposal for a regulation
Article 2 – paragraph 1 – point 9 a (new)
(9a) ‘competent authority’ means an independent judicial authority designated or created by the Member State.
2019/02/25
Committee: LIBE
Amendment 365 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers that have been subject to a significant number of uncontested removal orders shall take appropriate, reasonable and proportionate actions in accordance with this Regulation, against the dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner, and with due regard in all circumstances to the fundamental rights of the users and take into account the fundamental importance of the freedom of expression and information in an open and democratic society. These duties of care shall not result in a general monitoring obligation of the content which hosting service providers make available to the public.
2019/02/25
Committee: LIBE
Amendment 374 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, provisions to prevent the dissemination of terrorist content that inform their users about the rules relating to terrorist content pursuant to this Regulation.
2019/02/25
Committee: LIBE
Amendment 384 #
Proposal for a regulation
Article 4 – paragraph 1
1. The competent authority shall have the power to issue a decisionremoval order requiring the hosting service provider to remove terrorist content or disable access to it.
2019/02/25
Committee: LIBE
Amendment 390 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall remove terrorist content or disable access to it within one hour fromexpeditiously and as soon as possible after receipt of the removal order, taking into account the hosting provider’s size and resources.
2019/02/25
Committee: LIBE
Amendment 399 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a detailed statement of reasons explaining why the content is considered terrorist content, at least, by reference to the categories of terrorist content listed in Article 2(5) and substantiating the elements of unlawfulness and intentionality and the relevant national law;
2019/02/25
Committee: LIBE
Amendment 406 #
Proposal for a regulation
Article 4 – paragraph 3 – point e a (new)
(ea) a qualified electronic signature of the issuing authority, pursuant to Regulation (EU) 910/20141a; _________________ 1a Regulation (EU) 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73.)
2019/02/25
Committee: LIBE
Amendment 408 #
Proposal for a regulation
Article 4 – paragraph 3 – point f
(f) information about redress available to the hosting service provider and to the content provider, including redress with the competent authority as well as recourse to a court;
2019/02/25
Committee: LIBE
Amendment 410 #
Proposal for a regulation
Article 4 – paragraph 3 – point g
(g) where relevantnecessary and proportionate, the decision not to disclose information about the removal of terrorist content or the disabling of access to it referred to in Article 11.;
2019/02/25
Committee: LIBE
Amendment 414 #
Proposal for a regulation
Article 4 – paragraph 3 – point g a (new)
(ga) deadlines for appeal for the hosting service provider and for the content provider.
2019/02/25
Committee: LIBE
Amendment 415 #
Proposal for a regulation
Article 4 – paragraph 4
4. Upon request by the hosting service provider or by the content provider, the competent authority shall provide a detailed statement of reasons, without prejudice to the obligation of the hosting service provider to comply with the removal order within the deadline set out in paragraph 2.deleted
2019/02/25
Committee: LIBE
Amendment 423 #
Proposal for a regulation
Article 4 – paragraph 5
5. The competent authorities shall address removal orders to the main establishment of the hosting service provider or to the legal representative designated by the hosting service provider pursuant to Article 16 and transmit it to the point of contact referred to in Article 14(1). Such orders shall be sent by electronic means capable of producing a written record under conditions allowing to establish the authentication of the sender, including the accuracy of the date and the time of sending and receipt of the order, pursuant to point (ea) of paragraph 3.
2019/02/25
Committee: LIBE
Amendment 430 #
Proposal for a regulation
Article 4 – paragraph 7
7. If the hosting service provider cannot comply with the removal order because of force majeure or of de facto impossibility not attributable to the hosting service provider, it shall inform, without undue delay, the competent authority, explaining the reasons, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the reasons invoked are no longer present.
2019/02/25
Committee: LIBE
Amendment 434 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider cannotrefuses to comply with the removal order because the removal order contains manifest errors, does not sufficiently establish the illegality of the content, or does not contain sufficient information to execute the order, it shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 2In such cases, the competent authority shall apreply as soon as the clarification is providedpromptly.
2019/02/25
Committee: LIBE
Amendment 440 #
Proposal for a regulation
Article 4 – paragraph 9
9. The competent authority which issued the removal order shall inform the competent authority which oversees the implementation of proactive measuresadditional measures pursuant to Article 6, referred to in Article 17(1)(c) when the removal orders that have becomes final for a specific hosting provider reach a significant number. A removal order becomes final where it has not been appealed and judicial redress has not been sought within the deadline according to the applicable national law or where it has been confirmed following an appeal.
2019/02/25
Committee: LIBE
Amendment 446 #
Proposal for a regulation
Article 5
1. The competent authority or the relevant Union body may send a referral to a hosting service provider. 2. Hosting service providers shall put in place operational and technical measures facilitating the expeditious assessment of content that has been sent by competent authorities and, where applicable, relevant Union bodies for their voluntary consideration. 3. The referral shall be addressed to the main establishment of the hosting service provider or to the legal representative designated by the service provider pursuant to Article 16 and transmitted to the point of contact referred to in Article 14(1). Such referrals shall be sent by electronic means. 4. The referral shall contain sufficiently detailed information, including the reasons why the content is considered terrorist content, a URL and, where necessary, additional information enabling the identification of the terrorist content referred. 5. The hosting service provider shall, as a matter of priority, assess the content identified in the referral against its own terms and conditions and decide whether to remove that content or to disable access to it. 6. The hosting service provider shall expeditiously inform the competent authority or relevant Union body of the outcome of the assessment and the timing of any action taken as a result of the referral. 7. Where the hosting service provider considers that the referral does not contain sufficient information to assess the referred content, it shall inform without delay the competent authorities or relevant Union body, setting out what further information or clarification is required.Article 5 deleted Referrals
2019/02/25
Committee: LIBE
Amendment 460 #
Proposal for a regulation
Article 6 – title
6 Proactive measuresAdditional measures (This amendment applies throughout the text.)
2019/02/25
Committee: LIBE
Amendment 463 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shallmay, where appropriate, take proactivethey have been subject to a significant number of uncontested removal orders, take additional measures to protect their services against the dissemination of terrorist content. The measures shall be effective, targeted and proportionate, taking into accounto the risk and level of exposure to terrorist content, the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic societyand rights to privacy and personal data protection in an open and democratic society. The measures shall not result in any general monitoring of the content which hosting service providers make available to the public, nor to the automated removal of content without human intervention.
2019/02/25
Committee: LIBE
Amendment 475 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – introductory part
Where it has been informed according to Article 4(9), the competent authority referred to in Article 17(1)(c) shallmay request the hosting service provider to submit a report, within three months after receipt of the request and, if necessary, thereafter at least on an annual basis, on the specific proactiveadditional measures it has taken, including by using automated tools, with a view to:.
2019/02/25
Committee: LIBE
Amendment 478 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point a
(a) preventing the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;deleted
2019/02/25
Committee: LIBE
Amendment 487 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point b
(b) detecting, identifying and expeditiously removing or disabling access to terrorist content.deleted
2019/02/25
Committee: LIBE
Amendment 495 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 3
The reports shall include all relevant information allowing the competent authority referred to in Article 17(1)(c) to assess whether the proactiveadditional measures are effective and proportiproportionate and effectively contribute to addressing terrorist conatent online, including toan evaluate the functioning of any automated tools used as well as the human oversight and verification mechanisms employion of the nature and functioning measures it has taken, as well as information on the number of reinstated content and the human oversight, review mechanisms available to individuals and any verification mechanisms used to assess the illegality of the terrorist content removed or whose access has been disabled.
2019/02/25
Committee: LIBE
Amendment 501 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactiveadditional measures taken and reported under paragraph 2 are insufficient in mitigating and managing the risk and level of exposurenot reducing the risk and level of exposure, or do not respect the principles of necessity and proportionality, it may request the hosting service provider to take specific additional proactivre-assess the measures needed. For that purpose, the hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to identifying the specific measures that the hosting service provider shall consider to put in place, establishingincluding suggestions for key objectives and benchmarks, as well as timelines for their implementand taking into account, in particular, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information, as well as rights to privacy and personal data protection.
2019/02/25
Committee: LIBE
Amendment 505 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within the three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall take into account, in particular, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).deleted
2019/02/25
Committee: LIBE
Amendment 520 #
Proposal for a regulation
Article 6 – paragraph 5
5. A hosting service provider may, at any time, request the competent authority referred to in Article 17(1)(c) a review and, where appropriate, to revoke a request or decision pursuant to paragraphs 2, 3, and 4 respectively3. The competent authority shall provide a reasoned decision within a reasonable period of time after receiving the request by the hosting service provider.
2019/02/25
Committee: LIBE
Amendment 523 #
Proposal for a regulation
Article 7 – paragraph 1 – introductory part
1. Hosting service providers shall preserve terrorist content which has been removed or disabled as a result of a removal order, a referral or as a result of proactiveadditional measures pursuant to Articles 4, 5 and 6 and related user data removed as a consequence of the removal of the terrorist content and which is necessary for:
2019/02/25
Committee: LIBE
Amendment 533 #
Proposal for a regulation
Article 7 – paragraph 1 – point b a (new)
(ba) remedying complaints following the mechanism described in Article 10.
2019/02/25
Committee: LIBE
Amendment 534 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
1a. The obligation in paragraph 1 shall also apply when hosting service providers remove content as a consequence of a referral by Europol within the meaning of Article 4(1) (m) of Regulation (EU) 2016/794.
2019/02/25
Committee: LIBE
Amendment 535 #
Proposal for a regulation
Article 7 – paragraph 2
2. The terrorist content and related user data referred to in paragraph 1 shall be preserved for six months. The terrorist content shall, upon request from the competent authority or court, and shall be erased thereafter. The terrorist content shall be preserved for a longer period when and for as long as necessary for ongoing proceedings of: (a) administrative or judicial review referred to in paragraph 1(a), upon request from the competent authority or court; (b) the prevention, detection, investigation and prosecution of terrorist offences referred to in paragraph 1(b), upon request from the prosecutor or judge leading the respective criminal proceedings, (c) complaints referred to in paragraph 1(c), upon request of the complaint body of or acting on behalf of the hosting service provider.
2019/02/25
Committee: LIBE
Amendment 543 #
Proposal for a regulation
Article 7 – paragraph 3 a (new)
3a. Member States shall provide in national legislation that except in cases of validly established urgency, access to terrorist content and related user data preserved for any of the purposes under point (b) of paragraph 1 shall be authorised only after a prior review by a court or an investigating judge.
2019/02/25
Committee: LIBE
Amendment 544 #
Proposal for a regulation
Article 8 – title
Transparency obligations for hosting service providers
2019/02/25
Committee: LIBE
Amendment 546 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall set outexplain in clear manner in their terms and conditions their policy to prevent the dissemination ofand specific measures with regard to terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures including the use of automated toolsadditional measures, as well as a description of the complaint and arbitration mechanism available for content providers in accordance with Article 10.
2019/02/25
Committee: LIBE
Amendment 550 #
Proposal for a regulation
Article 8 – paragraph 2
2. Hosting service providers exposed to terrorist content that have received removal orders in a given year shall publish annual transparency reports on action taken against the dissemination of terrorist content for those years.
2019/02/25
Committee: LIBE
Amendment 558 #
Proposal for a regulation
Article 8 – paragraph 3 – point b
(b) information about the hosting service provider’s measures to preventaddress the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;
2019/02/25
Committee: LIBE
Amendment 561 #
Proposal for a regulation
Article 8 – paragraph 3 – point c
(c) number of pieces of terrorist content removed or to which access has been disabled, following removal orders, referrals, or proactive or additional measures, respectively;
2019/02/25
Committee: LIBE
Amendment 564 #
Proposal for a regulation
Article 8 – paragraph 3 – point c a (new)
(ca) number of pieces of alleged terrorist content which had to be made available again following a complaint or a redress;
2019/02/25
Committee: LIBE
Amendment 565 #
Proposal for a regulation
Article 8 – paragraph 3 – point c b (new)
(cb) number of pieces of alleged terrorist content which were not removed pursuant to paragraphs 7 and 8 of Article 4, and the grounds for not removing them;
2019/02/25
Committee: LIBE
Amendment 571 #
Proposal for a regulation
Article 8 a (new)
Article 8 a
Transparency obligations for competent authorities
1. Competent authorities shall publish annual transparency reports on removal orders and follow-up taken regarding terrorist content.
2. Transparency reports shall include at least the following information:
(a) information about the competent authority’s measures in relation to the detection, identification and removal of terrorist content;
(b) information about the competent and other authorities’ measures to prosecute the content providers or other persons, where applicable, following the removal or disabling of access of terrorist content;
(c) number of pieces of terrorist content removed or to which access has been disabled, following removal orders, referrals pursuant to Article 4(1) (m) of Regulation (EU) 2016/794, and additional measures, respectively;
(d) number of removals that have led to the successful detection, investigation and prosecution of terrorist offences;
(e) number of pieces of alleged terrorist content which had to be made available again following a redress;
(f) number of pieces of alleged terrorist content which were not removed pursuant to paragraphs 7 and 8 of Article 4, and the grounds for not removing them;
(g) overview and outcome of redress procedures.
2019/02/25
Committee: LIBE
Amendment 572 #
Proposal for a regulation
Article 9 – title
Safeguards regarding the use and implementation of proactive measurescontent removal
2019/02/25
Committee: LIBE
Amendment 575 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated toolmeasures that go beyond their obligations pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable content considered to be terrorist content, are accurate and well-founded and do not lead to the removal of or disabling access to legal content.
2019/02/25
Committee: LIBE
Amendment 583 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verifications where appropriate and, in any event, where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content, and of easily accessible complaint mechanisms.
2019/02/25
Committee: LIBE
Amendment 585 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2a. Content providers, whose content has been removed or access to which has been disabled following a removal order, shall have a right to an effective remedy. Member States shall put in place effective procedures for exercising this right.
2019/02/25
Committee: LIBE
Amendment 587 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish effective and easily accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referralspecific additional measures pursuant to Article 56, or of proactive measures pursuant to Article 6a referral by Europol within the meaning of Article 4(1) (m) of Regulation (EU) 2016/794, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/25
Committee: LIBE
Amendment 592 #
Proposal for a regulation
Article 10 – paragraph 2
2. Hosting service providers shall promptly examine every complaint that they receive and reinstate the content without undue delay where the removal or disabling of access was unjustified. They shall inform the complainant about the outcome of the examination within two weeks from the receipt of the complaint, with a clear explanation in cases where hosting service providers decide not to reinstate the content. A reinstatement of content shall not preclude further judicial measures against the decision of the hosting service provider or of the competent authority.
2019/02/25
Committee: LIBE
Amendment 597 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2a. Notwithstanding the provisions of paragraphs 1 and 2, the complaint mechanism of the hosting service providers shall be complementary to the applicable laws and procedures of the Member State in regard to the right to judicial review.
2019/02/25
Committee: LIBE
Amendment 600 #
Proposal for a regulation
Article 11 – paragraph 1
1. Where hosting service providers removed terrorist content or disable access to it, they shall make available to the content provider comprehensible and concise information on the removal or disabling of access to terrorist content. , including the reasons for the removal or disabling of access, including the legal basis establishing the unlawfulness of the content and possibilities to contest the decision. Where applicable, they shall also provide the content provider with a copy of the removal order pursuant to Article 4.
2019/02/25
Committee: LIBE
Amendment 603 #
Proposal for a regulation
Article 11 – paragraph 2
2. Upon request of the content provider, the hosting service provider shall inform the content provider about the reasons for the removal or disabling of access and possibilities to contest the decision.deleted
2019/02/25
Committee: LIBE
Amendment 608 #
Proposal for a regulation
Article 11 – paragraph 3
3. The obligation pursuant to paragraphs 1 and 2 shall not apply where the competent authority decides that there should be no disclosure for reasons of public security, such as the prevention, investigation, detection and prosecution of terrorist offences, for as long as necessary, but not exceeding [four] weeks from that decision. In such a case, the hosting service provider shall not disclose any information on the removal or disabling of access to terrorist content.
2019/02/25
Committee: LIBE
Amendment 611 #
Proposal for a regulation
Article 12 – paragraph 1
Member States shall ensure that their competent authorities have the necessary capability and sufficient resources to achieve the aims and fulfil their obligations under this Regulation, with strong guarantees of independence.
2019/02/25
Committee: LIBE
Amendment 614 #
Proposal for a regulation
Article 12 a (new)
Article 12 a
Judicial Redress
Member States shall ensure that in cases where content has been removed or access to it has been disabled as a result of a removal order pursuant to Article 4 of this Regulation, a referral pursuant to Article 4(1) (m) of Regulation (EU) 2016/794, or additional measures pursuant to Article 6 of this Regulation, the content provider concerned can initiate judicial proceedings at any time requesting reinstatement of the content. Initiation of judicial proceedings shall not be conditional on the initiation of complaint mechanisms referred to in Article 10.
2019/02/25
Committee: LIBE
Amendment 620 #
Proposal for a regulation
Article 13 – paragraph 1
1. Competent authorities in Member States shall inform, coordinate and cooperate with each other and, where appropriate, with relevant Union bodies such as Europol with regard to removal orders and referrals to avoid duplication, enhance coordination and avoid interference with investigations in different Member States.
2019/02/25
Committee: LIBE
Amendment 627 #
Proposal for a regulation
Article 13 – paragraph 3 – point b
(b) the processing and feedback relating to referrals pursuant to Article 5;deleted
2019/02/25
Committee: LIBE
Amendment 638 #
Proposal for a regulation
Article 14 – paragraph 1
1. Hosting service providers shall establish a point of contact allowing for the receipt of removal orders and referrals by electronic means and ensure their swiftexpeditious processing pursuant to Articles 4 and 5. They shall ensure that this information is made publicly available.
2019/02/25
Committee: LIBE
Amendment 643 #
Proposal for a regulation
Article 14 – paragraph 2
2. The information referred to in paragraph 1 shall specify the official language or languages (s) of the Union, as referred to in Regulation 1/58, in which the contact point can be addressed and in which further exchanges in relation to removal orders and referrals pursuant to Articles 4 and 5 shall take place. This shall include at least one of the official languages of the Member State in which the hosting service provider has its main establishment or where its legal representative pursuant to Article 16 resides or is established.
2019/02/25
Committee: LIBE
Amendment 645 #
Proposal for a regulation
Article 14 – paragraph 3
3. Member States shall establish a point of contact to handle requests for clarification and feedback in relation to removal orders and referrals issued by them. Information about the contact point shall be made publicly available.
2019/02/25
Committee: LIBE
Amendment 650 #
Proposal for a regulation
Article 15 – paragraph 2
2. Where a hosting service provider which does not have its main establishment within one of the Member States fails to designate a legal representative, all Member States shall have jurisdiction.
2019/02/25
Committee: LIBE
Amendment 653 #
Proposal for a regulation
Article 15 – paragraph 3
3. Where an authority of another Member State has issued a removal order according to Article 4(1), that Member State has jurisdiction to take coercive measures according to its national law in order to enforce the removal order.deleted
2019/02/25
Committee: LIBE
Amendment 656 #
Proposal for a regulation
Article 15 – paragraph 3 a (new)
3a. An appeal as referred to in Article 4(9) shall be lodged with the court of the Member State where the hosting provider has its main establishment or where the legal representative designated by the hosting service provider pursuant to Article 16 resides or is established.
2019/02/25
Committee: LIBE
Amendment 658 #
Proposal for a regulation
Article 16 – paragraph 1
1. A hosting service provider which does not have an establishment in the Union but offers services in the Union, shall designate, in writing, a legal or natural person as its legal representative in the Union for the receipt of, compliance with and enforcement of removal orders, referrals, requests and decisions issued by the competent authorities on the basis of this Regulation. The legal representative shall reside or be established in one of the Member States where the hosting service provider offers the services.
2019/02/25
Committee: LIBE
Amendment 661 #
Proposal for a regulation
Article 16 – paragraph 2
2. The hosting service provider shall entrust the legal representative with the receipt, compliance and enforcement of the removal orders, referrals, requests and decisions referred to in paragraph 1 on behalf of the hosting service provider concerned. Hosting service providers shall provide their legal representative with the necessary powers and resource to cooperate with the competent authorities and comply with these decisions and orders.
2019/02/25
Committee: LIBE
Amendment 667 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Each Member State shall designate the judicial authority or authorities competent to
2019/02/25
Committee: LIBE
Amendment 670 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) detect, identify and refer terrorist content to hosting service providers pursuant to Article 5;deleted
2019/02/25
Committee: LIBE
Amendment 677 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) oversee the implementation of proactiveadditional measures pursuant to Article 6;
2019/02/25
Committee: LIBE
Amendment 686 #
Proposal for a regulation
Article 18 – paragraph 1 – point c
(c) Article 5(5) and (6) (assessment of and feedback on referrals);deleted
2019/02/25
Committee: LIBE
Amendment 692 #
Proposal for a regulation
Article 18 – paragraph 1 – point d
(d) Article 6(2) and (4) (reports on proactive measures and the adoption of measures following a decision imposing specific proactiveadditional measures);
2019/02/25
Committee: LIBE
Amendment 695 #
Proposal for a regulation
Article 18 – paragraph 1 – point g
(g) Article 9 (safeguards in relation to proactive measurescontent removal);
2019/02/25
Committee: LIBE
Amendment 703 #
Proposal for a regulation
Article 18 – paragraph 3 – point d
(d) the financial strength of the legal person held liable; , and any gainful interest when breaching this Regulation;
2019/02/25
Committee: LIBE
Amendment 707 #
Proposal for a regulation
Article 18 – paragraph 3 – point e a (new)
(ea) the nature and size of the hosting service providers, in particular for microenterprises or small-sized enterprises within the meaning of Commission recommendation 2003/361/EC.
2019/02/25
Committee: LIBE
Amendment 713 #
Proposal for a regulation
Article 18 – paragraph 4
4. Member States shall ensure that a systematic and ongoing failure to comply with obligations pursuant to Article 4(2) is subject to financial penalties of up to 4% of the hosting service provider’s global turnover of the last business year.
2019/02/25
Committee: LIBE
Amendment 715 #
Proposal for a regulation
Article 19 – title
Technical requirements, criteria for assessing significance, and amendments to the templates for removal orders
2019/02/25
Committee: LIBE
Amendment 716 #
Proposal for a regulation
Article 19 – paragraph 1 a (new)
1a. The Commission shall be empowered to adopt delegated acts in accordance with Article 20 in order to complement this Regulation with criteria and figures to be used by competent authorities for determining what corresponds to a significant number of uncontested removal orders as referred to in this Regulation.
2019/02/25
Committee: LIBE
Amendment 717 #
Proposal for a regulation
Article 21 – paragraph 1 – point a
(a) information about the number of removal orders and referrals issued, the number of pieces of terrorist content which has been removed or access to it disabled, including the corresponding timeframes pursuant to Articles 4, and 5; information on the number of corresponding cases of successful detection, investigation and prosecution of terrorist offences;
2019/02/25
Committee: LIBE
Amendment 722 #
Proposal for a regulation
Article 21 – paragraph 1 – point b
(b) information about the specific proactiveadditional measures taken pursuant to Article 6, including the amount of terrorist content which has been removed or access to it disabled and the corresponding timeframes;
2019/02/25
Committee: LIBE
Amendment 723 #
Proposal for a regulation
Article 21 – paragraph 1 – point b a (new)
(ba) information about the number of access requests issued by national competent authorities regarding content preserved by the hosting service providers pursuant to Article 7;
2019/02/25
Committee: LIBE
Amendment 725 #
Proposal for a regulation
Article 21 – paragraph 1 – point d
(d) information about the number of redress procedures initiated pursuant to Article 12a and decisions taken by the competent authority in accordance with national law.
2019/02/25
Committee: LIBE
Amendment 728 #
Proposal for a regulation
Article 23 – paragraph 1
No sooner than [three years from the date of application of this Regulation], the Commission shall carry out an evaluation of this Regulation and submit a report to the European Parliament and to the Council on the application of this Regulation including the functioning of the effectiveness of the safeguard mechanisms. The report shall also cover the impact of this Regulation on freedom of expression and information, on media and journalism, on the arts, and on academic research. Where appropriate, the report shall be accompanied by legislative proposals. Member States shall provide the Commission with the information necessary for the preparation of the report.
2019/02/25
Committee: LIBE
Amendment 733 #
Proposal for a regulation
Article 24 – paragraph 2
It shall apply from [612 months after its entry into force].
2019/02/25
Committee: LIBE
Amendment 735 #
Proposal for a regulation
Annex I – paragraph 1
Under Article 4 of Regulation (EU)….16 the addressee of the removal order shall remove terrorist content or disable access to it, within one hour fromas soon as possible after receipt of the removal order from the competent authority. _________________ 16Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online (OJ L …).
2019/02/25
Committee: LIBE
Amendment 736 #
Proposal for a regulation
Annex I – paragraph 2
In accordance with Article 7 of Regulation (EU) ….17 , addressees must preserve content and related data, which has been removed or access to it disabled, for six months or longer upon request from the competent authorities or courts. Addressees must delete the content and related data immediately thereafter. _________________ 17Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online (OJ L …).
2019/02/25
Committee: LIBE
Amendment 737 #
Proposal for a regulation
Annex I – section A – paragraph 7
Member State of jurisdiction of addressee: [if different to issuing state] …………………………………………… …………………………………….……… ……..deleted
2019/02/25
Committee: LIBE
Amendment 738 #
Proposal for a regulation
Annex I – section B – title
B Content to be removed or access to it disabled within one houras soon as possible:
2019/02/25
Committee: LIBE
Amendment 739 #
Proposal for a regulation
Annex I – section B – paragraph 3 – introductory part
Reason(s) explaining why the content is considered terrorist content, in accordance with Article 2 (5) of the Regulation (EU) xxx. The content (tick the relevant box(es)):
2019/02/25
Committee: LIBE
Amendment 740 #
Proposal for a regulation
Annex I – section B – paragraph 3 – subparagraph 1
[ ] incites, advocates or glorifies the commission of terrorist offences (Article 2 (5) a)deleted
2019/02/25
Committee: LIBE
Amendment 741 #
Proposal for a regulation
Annex I – section B – paragraph 3 – subparagraph 2
[ ] encourages the contribution to terrorist offences (Article 2 (5) b)deleted
2019/02/25
Committee: LIBE
Amendment 742 #
Proposal for a regulation
Annex I – section B – paragraph 3 – subparagraph 3
[ ] promotes the activities of a terrorist group, encouraging participation in or support of the group (Article 2 (5) c)deleted
2019/02/25
Committee: LIBE
Amendment 743 #
Proposal for a regulation
Annex I – section B – paragraph 3 – subparagraph 4
[ ] provides instructions or techniques for committing terrorist offences (Article 2 (5) d)deleted
2019/02/25
Committee: LIBE
Amendment 744 #
Proposal for a regulation
Annex I – section B – paragraph 4
Additional information on the reasons why the content is considered terrorist content (optional): …………………………………………… …………………… …………………………………………… …………………………………………… ……….. …………………………………………… …………………………………………… ……….deleted
2019/02/25
Committee: LIBE
Amendment 745 #
Proposal for a regulation
Annex I – section C – paragraph 1 – subparagraph 1
[ ] for reasons of public security, (tick relevant box)
□ ongoing criminal investigations
□ preventing terrorist offences
the addressee must refrain from informing the content provider whose content is being removed or to which access has been disabled.
2019/02/25
Committee: LIBE
Amendment 746 #
Proposal for a regulation
Annex I – section C – paragraph 2
Otherwise: Details and deadlines of possibilities to contest the removal order in the issuing Member State (which canshall be passed to the content provider, if requested) under national law; see Section G below:
2019/02/25
Committee: LIBE
Amendment 747 #
Proposal for a regulation
Annex I – section D
D Informing Member State of jurisdiction [ ] Tick if the state of jurisdiction of the addressee is other than the issuing Member State: [ ] a copy of the removal order is sent to the relevant competent authority of the state of jurisdictiondeleted
2019/02/25
Committee: LIBE
Amendment 748 #
Proposal for a regulation
Annex I – section E – paragraph 1 – subparagraph 1
[ ] judge, [ ] court, or [ ] investigating judge
2019/02/25
Committee: LIBE
Amendment 749 #
Proposal for a regulation
Annex I – section E – paragraph 1 – subparagraph 2
[ ] law enforcement authoritydeleted
2019/02/25
Committee: LIBE
Amendment 750 #
Proposal for a regulation
Annex I – section E – paragraph 1 – subparagraph 3
[ ] other competent authority→ please complete also Section (F)deleted
2019/02/25
Committee: LIBE
Amendment 751 #
Proposal for a regulation
Annex I – section F – paragraph 3
Contact details of the authority of the state of jurisdiction of the addressee [if different to the issuing Member State]deleted
2019/02/25
Committee: LIBE
Amendment 753 #
Proposal for a regulation
Annex III – section B – point i – paragraph 3 a (new)
[ ] the removal order does not sufficiently establish the illegality of the content
2019/02/25
Committee: LIBE