
29 Amendments of Milan ZVER related to 2018/0331(COD)

Amendment 75 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, striking a balance with the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures may constitute legitimate interferences in the freedom of expression and information provided that they are strictly targeted, in the sense that they must relate to specific content and serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 89 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission of, or contribution to, terrorist offences, including financial or logistical contributions, provides instructions for the commission of such offences or promotes the participation in or dissemination of content related to activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.
_________________ 9Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/08
Committee: CULT
Amendment 94 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and make the information stored publicly available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example, such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services, to the extent that they make the information publicly available to third parties, and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment of whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States, taken in isolation, should not be a sufficient condition for the application of this Regulation.
2019/02/08
Committee: CULT
Amendment 97 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. In accordance with Article 15 of Directive 2000/31/EC, these duties of care should not amount to a general monitoring obligation and should be without prejudice to Chapter IX bis of Directive (EU) 2018/1808, where applicable. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in respect of freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 103 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities, allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. Given the disproportionately high level of harm that terrorist content can cause to the public or to the public order of a Member State, because of its high level of violence or its link to an on-going or very recent terrorist offence committed in the Member State concerned, Member States should be allowed in these cases to impose obligations on hosting service providers to ensure that the terrorist content identified in the duly justified removal order is removed or access to it is disabled immediately upon receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/08
Committee: CULT
Amendment 119 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take targeted proactive measures depending on the risks and level of exposure to terrorist content as well as on the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place. This requirement should not imply a general monitoring obligation in accordance with Article 15 of Directive 2000/31/EC and should be without prejudice to Chapter IX bis of Directive (EU) 2018/1808, which allows video-sharing platforms to take measures to protect the general public from content whose dissemination constitutes a penal infraction under Union law. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider is an indication of a low level of exposure to terrorist content.
2019/02/08
Committee: CULT
Amendment 130 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content which has been removed or to which access has been disabled as a result of a removal order or referrals they received, by checking against publicly or privately-held tools containing known terrorist content. They may also employ reliable technical tools to identify new terrorist content, for instance where it uses in part or in whole terrorist content that is already subject to a definitive removal order or where it is uploaded by users who have already uploaded terrorist content, either using tools available on the market or tools developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters, including the number of removal orders and referrals issued to the provider, the provider's economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
2019/02/08
Committee: CULT
Amendment 162 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, Member States should designate competent authorities, including judicial authorities, with the relevant expertise. The requirement to designate competent authorities does not necessarily require the establishment of new authorities; existing bodies may be tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders and referrals, for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks. Member States should notify to the European Commission the competent authorities they designated for the purpose of this Regulation.
2019/02/08
Committee: CULT
Amendment 189 #
Proposal for a regulation
Article 1 – paragraph 2
2. This Regulation shall apply to hosting service providers, as defined in this Regulation, offering services in the Union, irrespective of their place of main establishment.
2019/02/08
Committee: CULT
Amendment 195 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services consisting in the storage of information provided by and at the request of the content provider and in making the information stored publicly available to third parties;
2019/02/08
Committee: CULT
Amendment 219 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, in particular by encouraging participation in, meeting with, communication with or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541, or by encouraging the dissemination of terrorist content;
2019/02/08
Committee: CULT
Amendment 234 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means making terrorist content publicly available to third parties on the hosting service providers’ services;
2019/02/08
Committee: CULT
Amendment 241 #
Proposal for a regulation
Article 2 – paragraph 1 – point 9 a (new)
(9 a) ‘competent authority’ means a body, including judicial, with the relevant expertise designated or created by the Member State for the purpose of this Regulation.
2019/02/08
Committee: CULT
Amendment 246 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers shall take appropriate, reasonable and proportionate actions in accordance with this Regulation, against the dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner, with due regard to striking a balance with the fundamental rights of the users, and take into account the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 249 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, provisions to prevent the storing and dissemination of terrorist content on their services.
2019/02/08
Committee: CULT
Amendment 255 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall remove terrorist content or disable access to it within one hour from receipt of the removal order. Member States may provide that, where terrorist content is manifestly harmful or constitutes an immediate threat to the public order, hosting service providers shall remove or disable access to the terrorist content from the moment of receipt of a duly justified removal order.
2019/02/08
Committee: CULT
Amendment 266 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a statement of reasons explaining why the content is considered terrorist content, at least by reference to the categories of terrorist content listed in Article 2(5);
2019/02/08
Committee: CULT
Amendment 272 #
Proposal for a regulation
Article 4 – paragraph 3 – point f
(f) information about redress and associated deadlines available to the hosting service provider and to the content provider;
2019/02/08
Committee: CULT
Amendment 284 #
Proposal for a regulation
Article 4 – paragraph 7 a (new)
7 a. If the hosting service provider is an SME and cannot comply with the removal order because of logistical impossibility due to its size and capacities, it shall inform, without undue delay, the competent authority, explaining the reasons, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the reasons invoked are no longer present.
2019/02/08
Committee: CULT
Amendment 289 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider cannot comply with the removal order because the removal order contains manifest errors or does not contain sufficient technical information to execute the order, it shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the clarification is provided.
2019/02/08
Committee: CULT
Amendment 307 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shall, where appropriate, take proactive measures to protect their services against the dissemination of terrorist content. The measures shall be effective, targeted and proportionate, taking into account the risk and level of exposure to terrorist content, and strike a balance with the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 310 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point b
(b) detecting, identifying and expeditiously removing or disabling access to terrorist content comprising, in whole or in part, terrorist content that was subject to a definitive removal order.
2019/02/08
Committee: CULT
Amendment 313 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactive measures taken and reported under paragraph 2 are disproportionate or are insufficient in mitigating and managing the risk and level of exposure, it may request the hosting service provider to adapt the measures already taken or to take specific additional proactive measures. For that purpose, the hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to identifying the changes or specific measures that the hosting service provider shall put in place, establishing key objectives and benchmarks as well as timelines for their implementation.
2019/02/08
Committee: CULT
Amendment 316 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall take into account, in particular, the type of content hosted on the service, the technical feasibility of the measures, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
2019/02/08
Committee: CULT
Amendment 321 #
Proposal for a regulation
Article 7 – paragraph 1 – point b a (new)
(b a) the treatment of complaints issued in accordance with Article 10.
2019/02/08
Committee: CULT
Amendment 351 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verification of the appropriateness of the decision to remove or deny access to content, in particular with regard to the right to freedom of expression and information. Human oversight shall be required where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content.
2019/02/08
Committee: CULT
Amendment 357 #
Proposal for a regulation
Article 10 – paragraph 1
1. Without prejudice to the remedies, including judicial, available to content providers under national law, hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a substantiated complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 370 #
Proposal for a regulation
Article 11 – paragraph 2
2. The hosting service provider shall inform the content provider about the reasons for the removal or disabling of access and possibilities to contest the decision.
2019/02/08
Committee: CULT
Amendment 402 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. Member States shall ensure that, when determining the type and level of penalties, the competent authorities take into account all relevant circumstances, in particular in the case of SMEs, including:
2019/02/08
Committee: CULT