18 Amendments of Arndt KOHN related to 2018/0331(COD)
Amendment 133 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply, to the extent that it is possible to identify and remove specific content that is the subject of this Regulation, to information society services which store information provided by a recipient of the service at his or her request and in processing and making the information stored available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.
Amendment 142 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within six hours from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
Amendment 152 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what justified, appropriate, effective and proportionate proactive measure should be put in place. This requirement should not imply a general monitoring obligation applying in an indiscriminate or unlimited manner. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider is an indication of a low level of exposure to terrorist content.
Amendment 159 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks based on appropriate, sufficient and relevant evidence. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
Amendment 165 #
Proposal for a regulation
Recital 25
(25) Complaint procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. Hosting service providers should therefore establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider. Content providers should also have the right to complain directly to the competent authority if they are unable to resolve their complaint with a hosting service provider. The requirement for the hosting service provider to reinstate the content where it has been removed in error does not affect the possibility for hosting service providers to enforce their own terms and conditions on other grounds.
Amendment 176 #
Proposal for a regulation
Recital 38
(38) Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation, and should also take into account the situation of subsidiaries or linked undertakings where applicable. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within six hours from receipt of a removal order. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
Amendment 182 #
Proposal for a regulation
Recital 41
(41) Member States should collect information on the implementation of the legislation, including on policies, terms and conditions and transparency reports of hosting service providers. A detailed programme for monitoring the outputs, results and impacts of this Regulation should be established in order to inform an evaluation of the legislation.
Amendment 192 #
Proposal for a regulation
Article 1 – paragraph 2
2. This Regulation shall apply to hosting service providers offering services in the Union, irrespective of their place of main establishment, to the extent that it is possible for such providers to identify and remove specific content alleged to fall within the scope of Article 2(5).
Amendment 195 #
Proposal for a regulation
Article 1 – paragraph 2 a (new)
2 a. The application of this Regulation shall be subject to Union law regarding fundamental rights, freedoms and values as enshrined in particular in Articles 2 and 6 of the Treaty on European Union.
Amendment 203 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services consisting in the storage or processing of information provided by and at the request of the content provider and in making such information available to third parties;
Amendment 247 #
Proposal for a regulation
Article 4 – paragraph 1
1. The competent authority of a Member State shall have the power to issue a removal order requiring the hosting service provider to remove terrorist content or disable access to it, and shall immediately inform the competent authorities of any other Member States whose interests it considers may be concerned that a removal order has been issued.
Amendment 261 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall remove terrorist content or disable access to it within six hours from receipt of the removal order.
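Purely for illustration, and not as part of the amendment text: a minimal sketch of how a hosting service provider's internal tooling might track the six-hour window set by this paragraph. The class, field and function names are hypothetical assumptions, not terms used in the Regulation.

```python
# Illustrative sketch only (hypothetical names): tracking the six-hour window
# that runs from receipt of a removal order under Article 4(2) as amended.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class RemovalOrder:
    order_id: str          # reference assigned by the issuing competent authority
    content_url: str       # location of the content to be removed or disabled
    received_at: datetime  # moment the order was received by the provider

    @property
    def deadline(self) -> datetime:
        # Removal or disabling must happen within six hours of receipt.
        return self.received_at + timedelta(hours=6)

    def is_overdue(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return now > self.deadline

# An order received at 09:00 UTC must be executed by 15:00 UTC the same day.
order = RemovalOrder("ORDER-0001", "https://example.com/item/42",
                     datetime(2019, 3, 1, 9, 0, tzinfo=timezone.utc))
print(order.deadline.isoformat())  # 2019-03-01T15:00:00+00:00
```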
Amendment 282 #
Proposal for a regulation
Article 5 – paragraph 4
4. The referral shall contain sufficiently detailed information, including the reasons why the content is considered terrorist content, a URL and, where necessary, additional information enabling the identification of the terrorist content referred, including screenshots if obtainable.
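For illustration only, and not as part of the amendment text: the information a referral must contain under this paragraph can be pictured as a small structured record. The field names below are assumptions, not prescribed by the Regulation.

```python
# Illustrative sketch only: one possible representation of the information
# a referral under Article 5(4) is required to contain. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Referral:
    url: str                                      # URL identifying the referred content
    reasons: str                                  # why the content is considered terrorist content
    additional_information: Optional[str] = None  # only where necessary for identification
    screenshots: List[bytes] = field(default_factory=list)  # attached if obtainable

referral = Referral(
    url="https://example.com/item/42",
    reasons="Incitement to commit a terrorist offence within the meaning of Article 2(5).",
)
```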
Amendment 291 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shall, where justified and appropriate, take proactive measures to protect their services against the dissemination of terrorist content. The measures shall be effective and proportionate, taking into account the risk and level of exposure to terrorist content, the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society. Such measures shall be taken in accordance with Article 3(1) and in particular shall not include a system for monitoring or filtering all user content on an indiscriminate and unlimited basis.
Amendment 311 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within the three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall be based on appropriate, sufficient and relevant evidence and shall take into account, in particular, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
Amendment 322 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall set out in their terms and conditions their policy to prevent the dissemination of, and to protect users from, terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures including the use of automated tools.
Amendment 337 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use any automated tools or other proactive measures pursuant to, or otherwise in pursuit of the aims of, this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable content considered to be terrorist content, are accurate and well-founded, and do not lead to the removal or disabling of access to content that is not terrorist content.
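As a purely illustrative sketch of one common safeguard consistent with this paragraph (the human-review step, the names and the threshold are assumptions, not requirements spelled out in the Article): automated detections can be routed to human review so that no removal rests on the automated output alone.

```python
# Illustrative sketch only: a hypothetical safeguard in the spirit of Article 9(1),
# routing automated detections to human review before any removal decision.
from dataclasses import dataclass

@dataclass
class Detection:
    content_id: str
    score: float  # confidence from a hypothetical automated classifier

def next_step(detection: Detection, review_threshold: float = 0.8) -> str:
    """Decide what happens to an automated detection.

    Nothing is removed on the automated score alone: detections at or above the
    (hypothetical) threshold are queued for human review, which is one way to keep
    final decisions accurate and well-founded and to avoid removing lawful content.
    """
    if detection.score >= review_threshold:
        return "queue_for_human_review"
    return "no_action"

print(next_step(Detection("item-42", 0.93)))  # queue_for_human_review
```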
Amendment 345 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2 a. Content providers shall have the right to appeal to the competent authority in the event that they are unable to resolve their complaint regarding the removal of content or disabling of access directly with the hosting service provider.
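Again only as an illustrative sketch, separate from the amendment text (all names are assumptions): the escalation path this paragraph creates, from an unresolved complaint with the hosting service provider to an appeal before the competent authority, could be modelled as follows.

```python
# Illustrative sketch only: a hypothetical complaint record showing when the
# right of appeal under Article 10(2a) would arise. Names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Complaint:
    content_id: str
    grounds: str                              # content provider's reasons against removal
    provider_decision: Optional[str] = None   # outcome of the hosting provider's own review

    def may_appeal_to_competent_authority(self) -> bool:
        # The appeal route opens only where the complaint could not be resolved
        # directly with the hosting service provider.
        return self.provider_decision in (None, "rejected")

complaint = Complaint("item-42", "The material is journalistic reporting.", "rejected")
print(complaint.may_appeal_to_competent_authority())  # True
```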