
29 Amendments of Andrea BOCSKOR related to 2018/0331(COD)

Amendment 110 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities, allowing them to designate as competent authority a national body for the purpose of administrative, law enforcement or judicial tasks. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union, with regard to the specifics of their service and reach.
2019/02/08
Committee: CULT
Amendment 127 #
Proposal for a regulation
Recital 17
(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect illegal terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.
2019/02/08
Committee: CULT
Amendment 135 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are unsatisfactory to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
2019/02/08
Committee: CULT
Amendment 146 #
Proposal for a regulation
Recital 25
(25) Complaint procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. The relevant competent authorities should, in co-operation with hosting service providers, establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider. The requirement for the hosting service provider to reinstate the content where it has been removed in error does not affect the possibility of hosting service providers to enforce their own terms and conditions on other grounds.
2019/02/08
Committee: CULT
Amendment 159 #
Proposal for a regulation
Recital 34
(34) In the absence of a general requirement for service providers to ensure a physical presence within the territory of the Union, there is a need to ensure clarity under which Member State's jurisdiction the hosting service provider offering services within the Union falls. As a general rule, the hosting service provider falls under the jurisdiction of the Member State in which it has its main establishment or in which it has designated a legal representative. Nevertheless, where another Member State issues a removal order, its authority should be able to enforce its orders by taking coercive measures of a non-punitive nature, such as penalty payments. With regard to a hosting service provider which has no establishment in the Union and does not designate a legal representative, any Member State should, nevertheless, be able to issue penalties, provided that the principle of ne bis in idem is respected.
2019/02/08
Committee: CULT
Amendment 165 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, each Member State should designate one competent authority. The requirement to designate one competent authority does not necessarily require the establishment of a new authority but can be an existing body tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders, referrals and for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks.
2019/02/08
Committee: CULT
Amendment 169 #
Proposal for a regulation
Recital 38
(38) Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour from receipt of a removal order. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties and that the penalties should not be criminal in nature. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
2019/02/08
Committee: CULT
Amendment 265 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a comprehensive statement of reasons explaining why the content is considered terrorist content, at least, by reference to the categories of terrorist content listed in Article 2(5);
2019/02/08
Committee: CULT
Amendment 291 #
Proposal for a regulation
Article 4 – paragraph 9
9. The competent authority which issued the removal order shall inform the competent authority which oversees the implementation of proactive measures, referred to in Article 17(1)(c) when the removal order becomes final. A removal order becomes final where it has not been appealed within the deadline according to the applicable national law or where it has been confirmed following an appeal.
2019/02/08
Committee: CULT
Amendment 298 #
Proposal for a regulation
Article 5 – paragraph 4
4. The referral shall contain sufficiently detailed information, including a comprehensive list of the reasons why the content is considered terrorist content, a URL and, where necessary, additional information enabling the identification of the terrorist content referred.
2019/02/08
Committee: CULT
Amendment 304 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shall, where appropriate, take proactive measures to protect their services against the dissemination of terrorist content, without prejudice to Directive 2000/31/EC and Directive 2018/1808/EU. The measures shall be effective and proportionate, taking into account the risk and level of exposure to terrorist content, the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 323 #
Proposal for a regulation
Article 7 – paragraph 2
2. The terrorist content and related data referred to in paragraph 1 shall be preserved for six months. The terrorist content shall, upon request from the competent authority or court, be preserved only for a longer period when and for as long as necessary for ongoing proceedings of administrative or judicial review referred to in paragraph 1(a).
2019/02/08
Committee: CULT
Amendment 327 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall set out in their terms and conditions their policy to collaborate with the competent judicial or independent administrative authorities and to prevent the dissemination of terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures including the use of automated tools.
2019/02/08
Committee: CULT
Amendment 330 #
Proposal for a regulation
Article 8 – paragraph 2
2. Hosting service providers, competent authorities and Union bodies shall publish annual transparency reports on action taken against the dissemination of terrorist content.
2019/02/08
Committee: CULT
Amendment 333 #
Proposal for a regulation
Article 8 – paragraph 3 – introductory part
3. Transparency reports of the hosting service providers shall include at least the following information:
2019/02/08
Committee: CULT
Amendment 339 #
Proposal for a regulation
Article 8 – paragraph 3 – point b
(b) detailed information about the hosting service provider’s measures to prevent the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;
2019/02/08
Committee: CULT
Amendment 341 #
Proposal for a regulation
Article 8 – paragraph 3 – point c
(c) number of pieces of illegal terrorist content removed or to which access has been disabled, following removal orders, referrals, or proactive measures, respectively;
2019/02/08
Committee: CULT
Amendment 359 #
Proposal for a regulation
Article 10 – paragraph 1
1. The relevant Union bodies and competent authorities shall, in co-operation with hosting service providers, establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 379 #
Proposal for a regulation
Article 13 – paragraph 4
4. Where hosting service providers become aware of any evidence of terrorist offences, they shall promptly inform the authority competent for the investigation and prosecution of criminal offences in the concerned Member State or the point of contact in the Member State pursuant to Article 14(2), where they have their main establishment or a legal representative. Hosting service providers may, in case of doubt, transmit this information to Europol for appropriate follow-up.
2019/02/08
Committee: CULT
Amendment 387 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Each Member State shall designate one authority or authorities competent to
2019/02/08
Committee: CULT
Amendment 391 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) oversee the application of proactive measures pursuant to Article 6;
2019/02/08
Committee: CULT
Amendment 394 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1 a. Member States shall ensure that ‘competent authority’ means a national or European body with the power to issue, enforce and amend binding legal orders in their relevant jurisdictions.
2019/02/08
Committee: CULT
Amendment 398 #
Proposal for a regulation
Article 18 – paragraph 1 – point d
(d) Article 6(2) and (4) (reports on proactive measures and the adoption of such measures following a decision imposing specific proactive measures);
2019/02/08
Committee: CULT
Amendment 399 #
Proposal for a regulation
Article 18 – paragraph 1 – point g
(g) Article 9 (safeguards regarding the use and implementation of proactive measures);
2019/02/08
Committee: CULT
Amendment 400 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. Member States shall ensure that, when determining the type and level of penalties, which should not be seen as criminal, the competent authorities take into account all relevant circumstances, including:
2019/02/08
Committee: CULT
Amendment 408 #
Proposal for a regulation
Article 19 – paragraph 1
1. The Commission shall be empowered to adopt delegated acts in accordance with Article 20 in order to supplement this Regulation with the necessary technical requirements for the electronic means to be used by competent authorities for the transmission of removal orders.
2019/02/08
Committee: CULT
Amendment 409 #
Proposal for a regulation
Article 19 – paragraph 2
2. The Commission shall be empowered to adopt such delegated acts to amend Annexes I, II and III in order to competently address a possible need for improvements regarding the content of removal order forms and of forms to be used to provide information on the impossibility to execute the removal order.
2019/02/08
Committee: CULT
Amendment 411 #
Proposal for a regulation
Article 20 – paragraph 2
2. The power to adopt delegated acts referred to in Article 19 shall be conferred on the Commission for 3 years from [date of application of this Regulation].
2019/02/08
Committee: CULT
Amendment 414 #
Proposal for a regulation
Article 21 – paragraph 1 – point b
(b) information about the specific proactive measures taken pursuant to Article 6, including the amount of illegal terrorist content which has been removed or access to it disabled and the corresponding timeframes;
2019/02/08
Committee: CULT