
67 Amendments of Jana TOOM related to 2018/0331(COD)

Amendment 53 #
Proposal for a regulation
Title 1
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on preventing the dissemination of illegal terrorist content online – A contribution from the European Commission to the Leaders’ meeting in Salzburg on 19-20 September 2018
2019/02/08
Committee: CULT
Amendment 56 #
Proposal for a regulation
Recital 1
(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for the dissemination of illegal terrorist content. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users' trust in the online environment, and by strengthening safeguards to the freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 60 #
Proposal for a regulation
Recital 2
(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.
2019/02/08
Committee: CULT
Amendment 63 #
Proposal for a regulation
Recital 3
(3) The presence of terrorist content online has serious negative consequences for users, for citizens and society at large as well as for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. In light of their central role and the technological means and capabilities associated with the services they provide, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help competent authorities to tackle illegal terrorist content disseminated through their services.
2019/02/08
Committee: CULT
Amendment 68 #
Proposal for a regulation
Recital 5
(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8. In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any additional measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met. _________________ 8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2019/02/08
Committee: CULT
Amendment 74 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities as defined in this Regulation and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information and the rights to privacy and personal data protection, which constitute one of the essential foundations of a pluralist, democratic society and are among the values on which the Union is founded. Measures taken under this Regulation should be necessary, appropriate and proportionate to the aim they pursue to contribute to the fight against terrorism, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 79 #
Proposal for a regulation
Recital 8
(8) The right to an effective remedy is enshrined in Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union. Each natural or legal person has the right to an effective judicial remedy before the competent national court against any of the measures taken pursuant to this Regulation which can adversely affect the rights of that person. The right includes, in particular, the possibility for hosting service providers and content providers to be informed about redress, the possibility for content providers to appeal against removal decisions taken by the hosting service provider and the possibility for hosting service providers and content providers to effectively contest the removal orders before the court of the Member State whose authorities issued the removal order.
2019/02/08
Committee: CULT
Amendment 87 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to address the dissemination of illegal terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes illegal terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content and falls outside of the scope of this Regulation.
_________________ 9Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/08
Committee: CULT
Amendment 93 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services whose main business activity consists of storing information provided by a recipient of the service at his or her request and making the information stored publicly available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information publicly available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers hosting illegal terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.
2019/02/08
Committee: CULT
Amendment 100 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to limit the dissemination of illegal terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 108 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following a legal assessment by the competent authorities, should be harmonised. Member States should designate the competent authority among their independent administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that illegal terrorist content identified in the removal order is removed or access to it is disabled expeditiously after receiving the removal order, taking into account the capacity and resources of the hosting service provider. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union based on the definition of illegal terrorist content and provided that effective redress mechanisms are in place.
2019/02/08
Committee: CULT
Amendment 114 #
Proposal for a regulation
Recital 15
(15) Referrals by the competent authorities or Europol constitute an effective and swift means of making hosting service providers aware of specific content on their services. This mechanism of alerting hosting service providers to information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions, should remain available in addition to removal orders. It is important that hosting service providers assess such referrals as a matter of priority and provide swift feedback about action taken. The ultimate decision about whether or not to remove the content because it is not compatible with their terms and conditions remains with the hosting service provider. In implementing this Regulation related to referrals, Europol’s mandate as laid down in Regulation (EU) 2016/794 13 remains unaffected. _________________ 13 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
deleted
2019/02/08
Committee: CULT
Amendment 118 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing illegal terrorist content, additional measures could be taken, provided that these measures are appropriate, proportional and necessary to achieve the aim of this Regulation. These additional measures should not imply a general monitoring obligation.
2019/02/08
Committee: CULT
Amendment 124 #
Proposal for a regulation
Recital 17
(17) When putting in place additional measures, hosting service providers should ensure that users’ rights to freedom of expression and information - including to freely receive and impart information - and the rights to privacy and to protection of personal data - are preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not illegal terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.
2019/02/08
Committee: CULT
Amendment 128 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the additional measures taken. The service provider should report on the specific additional measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification.
2019/02/08
Committee: CULT
Amendment 133 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary additional measures to be put in place. The measures taken by the hosting service provider should not lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
2019/02/08
Committee: CULT
Amendment 138 #
Proposal for a regulation
Recital 21
(21) The obligation to preserve the content for proceedings of administrative or judicial review is necessary and justified in view of ensuring the effective measures of redress for the content provider whose content was removed or access to it disabled as well as for ensuring the reinstatement of that content as it was prior to its removal depending on the outcome of the review procedure. The obligation to preserve content for investigative and prosecutorial purposes is justified and necessary in view of the value this material could bring for the purpose of disrupting or preventing terrorist activity. Where companies remove material or disable access to it, in particular through their own proactive measures, and do not inform the relevant authority because they assess that it does not fall in the scope of Article 13(4) of this Regulation, law enforcement may be unaware of the existence of the content. Therefore, the preservation of content for purposes of prevention, detection, investigation and prosecution of terrorist offences is also justified. For these purposes, the required preservation of data is limited to data that is likely to have a link with terrorist offences, and can therefore contribute to prosecuting terrorist offences or to preventing serious risks to public security.
2019/02/08
Committee: CULT
Amendment 142 #
Proposal for a regulation
Recital 24
(24) Transparency of hosting service providers' policies in relation to terrorist content is essential to enhance their accountability towards their users and to reinforce trust of citizens in the Digital Single Market. Hosting service providers should publish annual transparency reports containing meaningful information about action taken in relation to the detection, identification and removal of illegal terrorist content.
2019/02/08
Committee: CULT
Amendment 148 #
Proposal for a regulation
Recital 26
(26) Effective legal protection according to Article 19 TEU and Article 47 of the Charter of Fundamental Rights of the European Union requires that persons are able to ascertain the reasons upon which the content uploaded by them has been removed or access to it disabled. For that purpose, the hosting service provider should make available to the content provider meaningful information enabling the content provider to contest the decision. However, this does not necessarily require a notification to the content provider. Depending on the circumstances, hosting service providers may replace content which is considered illegal terrorist content, with a message that it has been removed or disabled in accordance with this Regulation. Further information about the reasons as well as possibilities for the content provider to contest the decision should be given upon request. Where competent authorities decide that for reasons of public security including in the context of an investigation, it is considered inappropriate or counter-productive to directly notify the content provider of the removal or disabling of content, they should inform the hosting service provider.
2019/02/08
Committee: CULT
Amendment 168 #
Proposal for a regulation
Recital 38
(38) Penalties can be necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove illegal terrorist content or disable access to it expeditiously, taking into account the size and resources of the hosting service provider. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not illegal terrorist content.
2019/02/08
Committee: CULT
Amendment 177 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to address the misuse of hosting services for the dissemination of illegal terrorist content online. It lays down in particular:
2019/02/08
Committee: CULT
Amendment 181 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) rules on duties of care to be applied by hosting service providers in order to tackle the dissemination of illegal terrorist content through their services by ensuring, where necessary, its swift removal;
2019/02/08
Committee: CULT
Amendment 186 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify illegal terrorist content, to enable its swift removal by hosting service providers and to facilitate cooperation with the competent authorities in other Member States, hosting service providers and where appropriate relevant Union bodies.
2019/02/08
Committee: CULT
Amendment 194 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services whose main business activity consists of the storage of information provided by and at the request of the content provider and making the information stored publicly available to third parties;
2019/02/08
Committee: CULT
Amendment 206 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – introductory part
(5) 'illegal terrorist content' means one or more of the following information:
2019/02/08
Committee: CULT
Amendment 209 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point a
(a) inciting unlawfully and intentionally the commission of terrorist offences within the meaning of Article 3(1) of Directive 2017/541, where such conduct manifestly causes clear, substantial and imminent danger that one or more such offences be committed and is punishable as a criminal offence when committed intentionally;
2019/02/08
Committee: CULT
Amendment 213 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point b
(b) distributing or otherwise making available by other means online a message to the public, with a clear intent to:
- recruit for terrorism within the meaning of Article 6 of Directive 2017/541;
- provide training for terrorism within the meaning of Article 7 of Directive 2017/541;
- organise or otherwise facilitate travelling for the purpose of terrorism within the meaning of Article 10 of Directive 2017/541;
2019/02/08
Committee: CULT
Amendment 217 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541;
deleted
2019/02/08
Committee: CULT
Amendment 224 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point d
(d) instructing on methods or techniques for the purpose of committing terrorist offences.
deleted
2019/02/08
Committee: CULT
Amendment 232 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of illegal terrorist content’ means making illegal terrorist content publicly available to third parties on the hosting service providers’ services;
2019/02/08
Committee: CULT
Amendment 236 #
Proposal for a regulation
Article 2 – paragraph 1 – point 8
(8) 'referral' means a notice by a competent authority or, where applicable, a relevant Union body to a hosting service provider about information that may be considered terrorist content, for the provider’s voluntary consideration of the compatibility with its own terms and conditions aimed to prevent dissemination of terrorism content;
deleted
2019/02/08
Committee: CULT
Amendment 244 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers that have been subjected to a substantial number of uncontested removal orders shall take appropriate, reasonable and proportionate actions in accordance with this Regulation against the dissemination of illegal terrorist content and to protect users from illegal terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner, and with due regard to the fundamental rights of the users, and take into account the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 247 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, provisions to prevent the dissemination of terrorist content.
deleted
2019/02/08
Committee: CULT
Amendment 258 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall remove illegal terrorist content or disable access to it in an expeditious manner, taking into account the size and resources of the hosting service provider.
2019/02/08
Committee: CULT
Amendment 264 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a detailed statement of reasons explaining why the content is considered illegal terrorist content, at least, by specific reference to the categories of illegal terrorist content listed in Article 2(5);
2019/02/08
Committee: CULT
Amendment 271 #
Proposal for a regulation
Article 4 – paragraph 3 – point f
(f) information about redress and the deadline for redress available to the hosting service provider and to the content provider;
2019/02/08
Committee: CULT
Amendment 274 #
Proposal for a regulation
Article 4 – paragraph 3 – point g
(g) where necessary and appropriate, the decision not to disclose information about the removal of illegal terrorist content or the disabling of access to it referred to in Article 11.
2019/02/08
Committee: CULT
Amendment 275 #
Proposal for a regulation
Article 4 – paragraph 4
4. Upon request by the hosting service provider or by the content provider, the competent authority shall provide a detailed statement of reasons, without prejudice to the obligation of the hosting service provider to comply with the removal order within the deadline set out in paragraph 2.
deleted
2019/02/08
Committee: CULT
Amendment 279 #
Proposal for a regulation
Article 4 – paragraph 5
5. The competent authorities shall address removal orders to the main establishment of the hosting service provider or to the legal representative designated by the hosting service provider pursuant to Article 16 and transmit it to the point of contact referred to in Article 14(1). Such orders shall be sent by electronic means capable of producing a written record under conditions allowing to establish the authentication of the sender, including the accuracy of the date and the time of sending and receipt of the order. Such orders shall be made in one of the languages specified in accordance with Article 14(2).
2019/02/08
Committee: CULT
Amendment 285 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider refuses to comply with the removal order because the removal order contains manifest errors, does not contain sufficient information to execute the order, or does not sufficiently establish the illegality of the content in light of fundamental rights, it shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the clarification is provided.
2019/02/08
Committee: CULT
Amendment 292 #
Proposal for a regulation
Article 4 – paragraph 9
9. The competent authority which issued the removal order shall inform the competent authority which oversees the implementation of proactive measures, referred to in Article 17(1)(c), when the removal order becomes final. A removal order becomes final where redress has not been sought within the deadline according to the applicable national law or where it has been confirmed following an appeal.
2019/02/08
Committee: CULT
Amendment 294 #
Proposal for a regulation
Article 4 a (new)
Article 4 a
Cross-border cooperation related to removal orders
1. Where a competent authority of a Member State other than the one in which the main establishment of the hosting service provider or its designated representative is located wishes to request a removal order, it shall make the request to the competent authority referred to in Article 17(1)(a) of the Member State in which the main establishment of the hosting service provider or its designated representative is located.
2. The competent authority of the Member State in which the main establishment of the hosting service provider or its designated representative is located may issue the removal order requested in accordance with paragraph 1 to the hosting service provider in accordance with Article 4(5), provided that it meets all requirements set out in Article 4 under the hosting provider’s jurisdiction.
3. In cases where the competent authority of the Member State in which the main establishment of the hosting service provider is located does not issue the removal order, for example because it does not comply with Article 4 or because the competent authority has reasonable grounds to believe that the removal order may impact fundamental interests of that Member State, it shall inform the requesting competent authority accordingly.
2019/02/08
Committee: CULT
Amendment 296 #
Proposal for a regulation
Article 5
Article 5 – Referrals
1. The competent authority or the relevant Union body may send a referral to a hosting service provider.
2. Hosting service providers shall put in place operational and technical measures facilitating the expeditious assessment of content that has been sent by competent authorities and, where applicable, relevant Union bodies for their voluntary consideration.
3. The referral shall be addressed to the main establishment of the hosting service provider or to the legal representative designated by the service provider pursuant to Article 16 and transmitted to the point of contact referred to in Article 14(1). Such referrals shall be sent by electronic means.
4. The referral shall contain sufficiently detailed information, including the reasons why the content is considered terrorist content, a URL and, where necessary, additional information enabling the identification of the terrorist content referred.
5. The hosting service provider shall, as a matter of priority, assess the content identified in the referral against its own terms and conditions and decide whether to remove that content or to disable access to it.
6. The hosting service provider shall expeditiously inform the competent authority or relevant Union body of the outcome of the assessment and the timing of any action taken as a result of the referral.
7. Where the hosting service provider considers that the referral does not contain sufficient information to assess the referred content, it shall inform without delay the competent authorities or relevant Union body, setting out what further information or clarification is required.
deleted
2019/02/08
Committee: CULT
Amendment 303 #
Proposal for a regulation
Article 6
[...] deleted
2019/02/08
Committee: CULT
Amendment 328 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall clearly set out in their terms and conditions their policy to prevent the dissemination of illegal terrorist content, including, where appropriate, a description of the mechanism established in accordance with Article 10.
2019/02/08
Committee: CULT
Amendment 331 #
Proposal for a regulation
Article 8 – paragraph 2
2. Hosting service providers and competent authorities shall publish annual transparency reports on action taken against the dissemination of illegal terrorist content.
2019/02/08
Committee: CULT
Amendment 336 #
Proposal for a regulation
Article 8 – paragraph 3 – point a
(a) information about the hosting service provider’s measures in relation to the detection, identification and removal of illegal terrorist content;
2019/02/08
Committee: CULT
Amendment 337 #
Proposal for a regulation
Article 8 – paragraph 3 – point b
(b) information about the hosting service provider’s measures to prevent the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content; deleted
2019/02/08
Committee: CULT
Amendment 342 #
Proposal for a regulation
Article 8 – paragraph 3 – point c
(c) number of pieces of illegal terrorist content removed or to which access has been disabled following removal orders or additional measures, respectively;
2019/02/08
Committee: CULT
Amendment 346 #
Proposal for a regulation
Article 9 – title
Safeguards regarding the use and implementation of additional measures
2019/02/08
Committee: CULT
Amendment 350 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated tools pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable content considered to be illegal terrorist content, are accurate and well-founded.
2019/02/08
Committee: CULT
Amendment 353 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verifications where appropriate and, in any event, where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered illegal terrorist content.
2019/02/08
Committee: CULT
Amendment 358 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 4(m) of Regulation 2016/794, a removal order pursuant to Article 4 of this Regulation or additional measures, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 366 #
Proposal for a regulation
Article 11 – paragraph 1
1. Where hosting service providers remove illegal terrorist content or disable access to it, they shall make available to the content provider information on the removal or disabling of access to illegal terrorist content.
2019/02/08
Committee: CULT
Amendment 369 #
Proposal for a regulation
Article 11 – paragraph 2
2. The hosting service provider shall inform the content provider about the reasons for the removal or disabling of access and possibilities to contest the decision.
2019/02/08
Committee: CULT
Amendment 374 #
Proposal for a regulation
Article 13 – paragraph 1
1. Competent authorities in Member States shall inform, coordinate and cooperate with each other and, where appropriate, with relevant Union bodies such as Europol with regard to removal orders and referrals to avoid duplication, enhance coordination and avoid interference with investigations in different Member States.
2019/02/08
Committee: CULT
Amendment 375 #
Proposal for a regulation
Article 13 – paragraph 3 – point b
(b) the processing and feedback relating to referrals pursuant to Article 5; deleted
2019/02/08
Committee: CULT
Amendment 376 #
Proposal for a regulation
Article 13 – paragraph 3 – point c
(c) co-operation with a view to identify and implement proactive measures pursuant to Article 6. deleted
2019/02/08
Committee: CULT
Amendment 381 #
Proposal for a regulation
Article 14 – paragraph 1
1. Hosting service providers shall establish a point of contact allowing for the receipt of removal orders and referrals by electronic means and ensure their swift processing pursuant to Articles 4 and 5. They shall ensure that this information is made publicly available.
2019/02/08
Committee: CULT
Amendment 383 #
Proposal for a regulation
Article 14 – paragraph 2
2. The information referred to in paragraph 1 shall specify the official language or languages of the Union, as referred to in Regulation 1/58, in which the contact point can be addressed and in which further exchanges in relation to removal orders and referrals pursuant to Articles 4 and 5 shall take place. This shall include at least one of the official languages of the Member State in which the hosting service provider has its main establishment or where its legal representative pursuant to Article 16 resides or is established.
2019/02/08
Committee: CULT
Amendment 386 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Each Member State shall designate the independent authority or authorities competent to
2019/02/08
Committee: CULT
Amendment 389 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) detect, identify and refer terrorist content to hosting service providers pursuant to Article 5; deleted
2019/02/08
Committee: CULT
Amendment 392 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) oversee the implementation of proactive measures pursuant to Article 6;
2019/02/08
Committee: CULT
Amendment 396 #
Proposal for a regulation
Article 18 – paragraph 1 – point c
(c) Article 5(5) and (6) (assessment of and feedback on referrals); deleted
2019/02/08
Committee: CULT
Amendment 397 #
Proposal for a regulation
Article 18 – paragraph 1 – point d
(d) Article 6(2) and (4) (reports on proactive measures and the adoption of measures following a decision imposing specific proactive measures); deleted
2019/02/08
Committee: CULT
Amendment 413 #
Proposal for a regulation
Article 21 – paragraph 1 – point a
(a) information about the number of removal orders and referrals issued, the number of pieces of illegal terrorist content which has been removed or access to it disabled, including the corresponding timeframes pursuant to Articles 4 and 5;
2019/02/08
Committee: CULT
Amendment 415 #
Proposal for a regulation
Article 21 – paragraph 1 – point b
(b) information about the specific additional measures taken pursuant to Article 6, including the amount of illegal terrorist content which has been removed or access to it disabled and the corresponding timeframes;
2019/02/08
Committee: CULT