
24 Amendments of Luigi MORGANO related to 2018/0331(COD)

Amendment 54 #
Proposal for a regulation
Recital 1
(1) This Regulation aims at ensuring the smooth functioning of the digital single market in an open and democratic society, by preventing the misuse of hosting services for terrorist purposes and providing a specific tool for countering such issues and helping to ensure freedom and security for citizens. The functioning of the digital single market should be improved by reinforcing legal certainty for hosting service providers, reinforcing users' trust in the online environment, and by strengthening safeguards to the freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 59 #
Proposal for a regulation
Recital 2
(2) Hosting service providers active on the internet play an essential role in the digital economy by connecting business and citizens and by facilitating public debate and the distribution and receipt of information, opinions and ideas, contributing significantly to innovation, economic growth and job creation in the Union. However, their services are in certain cases abused by third parties to carry out illegal activities online, which are a criminal offence under Union law. Of particular concern is the misuse of hosting service providers by terrorist groups and their supporters to disseminate terrorist content online in order to spread their message, to radicalise and recruit and to facilitate and direct terrorist activity.
2019/02/08
Committee: CULT
Amendment 61 #
Proposal for a regulation
Recital 3
(3) While not the only factor, the presence of terrorist content online has proven to be crucial in terms of radicalising individuals who have committed terrorist acts within the Union and beyond, which has had very serious negative consequences for users, for citizens and society at large, but also for the online service providers hosting such content, since it undermines the trust of their users and damages their business models. Accordingly, in light of their central role and professional capabilities, in addition to the technological means and capabilities associated with the services they provide, while taking account of the importance of safeguarding the fundamental freedoms of expression and information, online service providers have particular societal responsibilities to protect their services from misuse by terrorists and to help tackle terrorist content disseminated through their services.
2019/02/08
Committee: CULT
Amendment 64 #
Proposal for a regulation
Recital 4
(4) Efforts at Union level to counter terrorist content online commenced in 2015 through a framework of voluntary cooperation between Member States and hosting service providers. Unfortunately, that cooperation turned out to be insufficient to counter this phenomenon; Union law therefore needs to be complemented by a clear legislative framework in order to further reduce accessibility to terrorist content online and adequately address a rapidly evolving problem. This legislative framework seeks to build on voluntary efforts, which were reinforced by the Commission Recommendation (EU) 2018/334, and responds to calls made by the European Parliament to strengthen measures to tackle illegal and harmful content and by the European Council to improve the automatic detection and removal of content that incites to terrorist acts. _________________ 7 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ L 63, 6.3.2018, p. 50).
2019/02/08
Committee: CULT
Amendment 65 #
Proposal for a regulation
Recital 5
(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC. In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision, recalling that Article 14 requires service providers to act expeditiously to remove or to disable access to illegal content upon receiving knowledge of illegal activity or information. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met. _________________ 8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2019/02/08
Committee: CULT
Amendment 76 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should adopt exclusively measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures constituting interference in the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 77 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards that ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures potentially impacting on the freedom of expression and information should be strictly targeted, in the sense that they must serve to prevent the dissemination of terrorist content and should not affect the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 83 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution to terrorist offences, provides instructions for the commission of such offences or promotes the participation in activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Obviously, content disseminated for educational, journalistic or research purposes should be identified and adequately protected and should not be equated with incitement to terrorism unless the dissemination of such content enables it to be used for terrorist purposes; a fair balance will thus be struck between freedom of expression and information and public security requirements. In particular, any decision to remove journalistic content should take account of journalists' codes of self-regulation and ethics, in accordance with Article 11 of the Charter of Fundamental Rights of the European Union. In the interest of consistency, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content. _________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/08
Committee: CULT
Amendment 96 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to prevent and deter the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation and should be without prejudice to Article 15 of Directive 2000/31/EC. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the observance of freedom of expression and information. Effective and expeditious complaints and redress mechanisms should be made available by the hosting service providers in the case of unjustified removals of content.
2019/02/08
Committee: CULT
Amendment 107 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task, by making provision for appropriate procedures to ensure that every decision complies with the definition of terrorist content, for instance by providing for independent supervision. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/08
Committee: CULT
Amendment 109 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. In case of delays, the nature and size of the hosting service providers should be taken into account, particularly in the case of microenterprises or small-sized enterprises. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/08
Committee: CULT
Amendment 116 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place. This requirement should not imply a general monitoring obligation pursuant to Article 15 of Directive 2000/31/EC and Article 28(b), paragraph 1(c) of Directive (EU) 2018/1808, which requires video-sharing platform providers to take appropriate measures to protect the general public from programmes containing content the dissemination of which constitutes an activity which is a criminal offence under Union law. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider is an indication of a low level of exposure to terrorist content.
2019/02/08
Committee: CULT
Amendment 125 #
Proposal for a regulation
Recital 17
(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. Hosting service providers should put in place effective and expeditious complaints and redress mechanisms to address cases of unjustified removals of content.
2019/02/08
Committee: CULT
Amendment 132 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken, as well as on the functioning of the complaints and redress mechanisms. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ the use of reliable technical tools to identify new terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, its economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
2019/02/08
Committee: CULT
Amendment 176 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to prevent and counter the misuse of hosting services for the dissemination of terrorist content online. It lays down in particular:
2019/02/08
Committee: CULT
Amendment 183 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify terrorist content, to enable its swift removal by hosting service providers and to facilitate cooperation with the competent authorities in other Member States, hosting service providers and where appropriate relevant Union bodies, while complying with Union legislation specifying clear safeguards with regard to freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 184 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify terrorist content, to enable its swift removal by hosting service providers and to facilitate cooperation with the relevant competent authorities in other Member States, hosting service providers and where appropriate relevant Union bodies.
2019/02/08
Committee: CULT
Amendment 250 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, effective and proportionate provisions to prevent the dissemination of terrorist content.
2019/02/08
Committee: CULT
Amendment 308 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point a
(a) effectively preventing the re-upload of content which has previously been removed or to which access has been disabled because it is found to contain terrorist content;
2019/02/08
Committee: CULT
Amendment 343 #
Proposal for a regulation
Article 8 – paragraph 3 – point d
(d) assessment of the effectiveness of the complaint and redress mechanisms.
2019/02/08
Committee: CULT
Amendment 348 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated tools pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable access to content considered to be terrorist content, are accurate and well-founded.
2019/02/08
Committee: CULT
Amendment 355 #
Proposal for a regulation
Article 10 – title
10 Complaint and redress mechanisms
2019/02/08
Committee: CULT
Amendment 362 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish expeditious, effective and accessible complaints and redress mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 404 #
Proposal for a regulation
Article 18 – paragraph 3 – point e a (new)
(e a) the nature and size of the hosting service providers, in particular microenterprises or small-sized enterprises, within the meaning of Commission Recommendation 2003/361/EC.
2019/02/08
Committee: CULT