
28 Amendments of Giorgos GRAMMATIKAKIS related to 2018/0331(COD)

Amendment 54 #
Proposal for a regulation
Recital 1
(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes and providing a specific tool for countering such issues and helping to ensure freedom and security for citizens. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users' trust in the online environment, and by strengthening safeguards to the freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 61 #
Proposal for a regulation
Recital 3
(3) While not the only factor, the presence of terrorist content online has proven to be crucial in terms of radicalising individuals who have committed terrorist acts within the Union and beyond, which has had very serious negative consequences for users, for citizens and society at large, but also for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. Accordingly, in light of their central role and professional capabilities, in addition to the technological means and capabilities associated with the services they provide, while taking account of the importance of safeguarding the fundamental freedoms of expression and information, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.
2019/02/08
Committee: CULT
Amendment 64 #
Proposal for a regulation
Recital 4
(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers. Unfortunately, that cooperation turned out to be insufficient to counter this phenomenon; Union law therefore needs to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/3347, and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts. _________________ 7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).
2019/02/08
Committee: CULT
Amendment 65 #
Proposal for a regulation
Recital 5
(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8. In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision, recalling that Article 14 requires service providers to act expeditiously to remove or to disable access to illegal content upon receiving knowledge of illegal activity or information. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met. _________________ 8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2019/02/08
Committee: CULT
Amendment 76 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should adopt exclusively measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 77 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards that ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures potentially impacting on the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, and should not affect the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 88 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be excluded from the scope of this Regulation, provided that it does not incite the commission of violence. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content, provided that it does not incite the commission of violence. _________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/08
Committee: CULT
Amendment 96 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to prevent and deter the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation and should be without prejudice to Article 15 of Directive 2000/31/EC. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information. Effective and expeditious complaints and redress mechanisms should be made available by the hosting service providers in the case of unjustified removals of content.
2019/02/08
Committee: CULT
Amendment 109 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities, allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. In the case of delays, the nature and size of the hosting service providers should be taken into account, particularly in the case of microenterprises or small-sized enterprises. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/08
Committee: CULT
Amendment 121 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate, effective and proportionate to take proactive measures depending on the risks and level of exposure to terrorist content as well as on the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place. This requirement should not imply a general monitoring obligation and is without prejudice to Article 15 of Directive 2000/31/EC. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider is an indication of a low level of exposure to terrorist content.
2019/02/08
Committee: CULT
Amendment 125 #
Proposal for a regulation
Recital 17
(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. Hosting service providers should put in place effective and expeditious complaints and redress mechanisms to address cases of unjustified removals of content.
2019/02/08
Committee: CULT
Amendment 132 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken, as well as on the functioning of the complaints and redress mechanisms. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, its economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
2019/02/08
Committee: CULT
Amendment 137 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, only as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
2019/02/08
Committee: CULT
Amendment 167 #
Proposal for a regulation
Recital 38
(38) Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on such penalties, which should be proportionate and practicable, taking into account the size and the nature of the hosting service provider concerned. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour from receipt of a removal order. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure, but do not encourage the arbitrary removal of content which is not terrorist content. In order to ensure legal certainty, the Regulation should set out to what extent the relevant obligations can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
2019/02/08
Committee: CULT
Amendment 176 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to prevent and counter the misuse of hosting services for the dissemination of terrorist content online. It lays down in particular:
2019/02/08
Committee: CULT
Amendment 184 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify terrorist content, to enable its swift removal by hosting service providers and to facilitate cooperation with the relevant competent authorities in other Member States, hosting service providers and, where appropriate, relevant Union bodies.
2019/02/08
Committee: CULT
Amendment 205 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – introductory part
(5) 'terrorist content' means any material, other than material used for educational, journalistic and research purposes, provided that it does not incite the commission of violence, that:
2019/02/08
Committee: CULT
Amendment 211 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point a
(a) incites or advocates, including by glorifying, the commission of terrorist offences, thereby causing a danger that such acts be committed;
2019/02/08
Committee: CULT
Amendment 216 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point b
(b) encourages the contribution to terrorist offences;
2019/02/08
Committee: CULT
Amendment 223 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promotes the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541;
2019/02/08
Committee: CULT
Amendment 228 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point d
(d) instructs on methods or techniques for the purpose of committing terrorist offences.
2019/02/08
Committee: CULT
Amendment 250 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, effective and proportionate provisions to prevent the dissemination of terrorist content.
2019/02/08
Committee: CULT
Amendment 308 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point a
(a) effectively preventing the re-upload of content which has previously been removed or to which access has been disabled because it is found to contain terrorist content;
2019/02/08
Committee: CULT
Amendment 343 #
Proposal for a regulation
Article 8 – paragraph 3 – point d
(d) assessment of the effectiveness of the complaint and redress mechanisms.
2019/02/08
Committee: CULT
Amendment 348 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated tools pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable access to content considered to be terrorist content, are accurate and well-founded.
2019/02/08
Committee: CULT
Amendment 355 #
Proposal for a regulation
Article 10 – title
10 Complaint and redress mechanisms
2019/02/08
Committee: CULT
Amendment 362 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish expeditious, effective and accessible complaints and redress mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 404 #
Proposal for a regulation
Article 18 – paragraph 3 – point e a (new)
(e a) the nature and size of the hosting service providers, in particular microenterprises or small-sized enterprises, within the meaning of Commission Recommendation 2003/361/EC.
2019/02/08
Committee: CULT