Activities of Svetoslav Hristov MALINOV related to 2018/0331(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online
2016/11/22
Committee: CULT
Dossiers: 2018/0331(COD)
Documents: PDF(276 KB) DOC(199 KB)

Amendments (58)

Amendment 75 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, striking a balance with the freedom of expression and information, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures may constitute legitimate interferences in the freedom of expression and information provided that they are strictly targeted, in the sense that they must relate to specific content and serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law.
2019/02/08
Committee: CULT
Amendment 89 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online, this Regulation should establish a definition of terrorist content for preventative purposes drawing on the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 . Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission or contribution, including financial or logistical, to terrorist offences, provides instructions for the commission of such offences or promotes the participation in or dissemination of content related to activities of a terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, journalistic or research purposes should be adequately protected. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content. 
_________________ 9Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/08
Committee: CULT
Amendment 94 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where terrorist content is disseminated, this Regulation should apply to information society services which store information provided by a recipient of the service at his or her request and make the information stored publicly available to third parties, irrespective of whether this activity is of a mere technical, automatic and passive nature. By way of example such providers of information society services include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services to the extent they make the information publicly available to third parties and websites where users can make comments or post reviews. The Regulation should also apply to hosting service providers established outside the Union but offering services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment whether the service provider enables legal or natural persons in one or more Member States to use its services. However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation.
2019/02/08
Committee: CULT
Amendment 97 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care, in order to prevent the dissemination of terrorist content on their services. In accordance with Article 15 of Directive 2000/31/EC, these duties of care should not amount to a general monitoring obligation and be without prejudice to Chapter IX bis of Directive (EU) 2018/1808, where applicable. Duties of care should include that, when applying this Regulation, hosting services providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in the respect of freedom of expression and information.
2019/02/08
Committee: CULT
Amendment 103 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. Given the disproportionately high level of harm that terrorist content can cause to the public or to the public order of a Member State, because of its high level of violence or its link to an ongoing or very recent terrorist offence committed in the Member State concerned, Member States should be allowed in these cases to impose obligations on hosting service providers to ensure that the terrorist content identified in the duly justified removal order is removed or access to it is disabled immediately upon receipt of the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/08
Committee: CULT
Amendment 110 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities allowing them to designate as competent authority a national body for the purpose of administrative, law enforcement or judicial tasks. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union, with regard to the specifics of their service and reach.
2019/02/08
Committee: CULT
Amendment 119 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate to take targeted proactive measures depending on the risks and level of exposure to terrorist content as well as to the effects on the rights of third parties and the public interest of information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place. This requirement should not imply a general monitoring obligation in accordance with Article 15 of Directive 2000/31/EC and be without prejudice to Chapter IX bis of Directive (EU) 2018/1808 which allows video-sharing platforms to take measures to protect the general public from content whose dissemination constitutes a penal infraction under Union law. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting provider is an indication of a low level of exposure to terrorist content.
2019/02/08
Committee: CULT
Amendment 127 #
Proposal for a regulation
Recital 17
(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect illegal terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights.
2019/02/08
Committee: CULT
Amendment 130 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken. These could consist of measures to prevent the re-upload of terrorist content, removed or access to it disabled as a result of a removal order or referrals they received, checking against publicly or privately-held tools containing known terrorist content. They may also employ reliable technical tools to identify new terrorist content, for instance where it uses in part or whole terrorist content that is already subject to a definitive removal order or where it is uploaded by users who already uploaded terrorist content, either using those available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters including the number of removal orders and referrals issued to the provider, its economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
2019/02/08
Committee: CULT
Amendment 135 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are unsatisfactory to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
2019/02/08
Committee: CULT
Amendment 146 #
Proposal for a regulation
Recital 25
(25) Complaint procedures constitute a necessary safeguard against erroneous removal of content protected under the freedom of expression and information. The relevant competent authorities should, in co-operation with hosting service providers, establish user-friendly complaint mechanisms and ensure that complaints are dealt with promptly and in full transparency towards the content provider. The requirement for the hosting service provider to reinstate the content where it has been removed in error, does not affect the possibility of hosting service providers to enforce their own terms and conditions on other grounds.
2019/02/08
Committee: CULT
Amendment 159 #
Proposal for a regulation
Recital 34
(34) In the absence of a general requirement for service providers to ensure a physical presence within the territory of the Union, there is a need to ensure clarity under which Member State's jurisdiction the hosting service provider offering services within the Union falls. As a general rule, the hosting service provider falls under the jurisdiction of the Member State in which it has its main establishment or in which it has designated a legal representative. Nevertheless, where another Member State issues a removal order, its authority should be able to enforce its orders by taking coercive measures of a non-punitive nature, such as penalty payments. With regard to a hosting service provider which has no establishment in the Union and does not designate a legal representative, any Member State should, nevertheless, be able to issue penalties, provided that the principle of ne bis in idem is respected.
2019/02/08
Committee: CULT
Amendment 162 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, Member States should designate competent authorities, including judicial, with the relevant expertise. The requirement to designate competent authorities does not necessarily require the establishment of new authorities; existing bodies can be tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders, referrals and for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks. Member States should notify to the European Commission the competent authorities they designated for the purpose of this Regulation.
2019/02/08
Committee: CULT
Amendment 165 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, each Member State should designate one competent authority. The requirement to designate one competent authority does not necessarily require the establishment of a new authority; an existing body can be tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders, referrals and for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks.
2019/02/08
Committee: CULT
Amendment 169 #
Proposal for a regulation
Recital 38
(38) Penalties are necessary to ensure the effective implementation by hosting service providers of the obligations pursuant to this Regulation. Member States should adopt rules on penalties, including, where appropriate, fining guidelines. Particularly severe penalties shall be ascertained in the event that the hosting service provider systematically fails to remove terrorist content or disable access to it within one hour from receipt of a removal order. Non-compliance in individual cases could be sanctioned while respecting the principles of ne bis in idem and of proportionality and ensuring that such sanctions take account of systematic failure. In order to ensure legal certainty, the regulation should set out to what extent the relevant obligations can be subject to penalties and that the penalties should not be criminal in nature. Penalties for non-compliance with Article 6 should only be adopted in relation to obligations arising from a request to report pursuant to Article 6(2) or a decision imposing additional proactive measures pursuant to Article 6(4). When determining whether or not financial penalties should be imposed, due account should be taken of the financial resources of the provider. Member States shall ensure that penalties do not encourage the removal of content which is not terrorist content.
2019/02/08
Committee: CULT
Amendment 189 #
Proposal for a regulation
Article 1 – paragraph 2
2. This Regulation shall apply to hosting service providers, as defined in this Regulation, offering services in the Union, irrespective of their place of main establishment.
2019/02/08
Committee: CULT
Amendment 195 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services consisting in the storage of information provided by and at the request of the content provider and in making the information stored publicly available to third parties;
2019/02/08
Committee: CULT
Amendment 219 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, in particular by encouraging the participation in, meeting with, communication with or support to a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541, or by encouraging the dissemination of terrorist content;
2019/02/08
Committee: CULT
Amendment 234 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means making terrorist content publicly available to third parties on the hosting service providers’ services;
2019/02/08
Committee: CULT
Amendment 241 #
Proposal for a regulation
Article 2 – paragraph 1 – point 9 a (new)
(9 a) ‘competent authority’ means a body, including judicial, with the relevant expertise designated or created by the Member State for the purpose of this Regulation.
2019/02/08
Committee: CULT
Amendment 246 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers shall take appropriate, reasonable and proportionate actions in accordance with this Regulation, against the dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner, with due regard to striking a balance with the fundamental rights of the users, and take into account the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 249 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, provisions to prevent the storing and dissemination of terrorist content on their services.
2019/02/08
Committee: CULT
Amendment 255 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall remove terrorist content or disable access to it within one hour from receipt of the removal order. Member States may provide that, where terrorist content is manifestly harmful or constitutes an immediate threat to the public order, hosting service providers shall remove or disable access to the terrorist content from the moment of receipt of a duly justified removal order.
2019/02/08
Committee: CULT
Amendment 265 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a comprehensive statement of reasons explaining why the content is considered terrorist content, at least, by reference to the categories of terrorist content listed in Article 2(5);
2019/02/08
Committee: CULT
Amendment 266 #
Proposal for a regulation
Article 4 – paragraph 3 – point b
(b) a statement of reasons explaining why the content is considered terrorist content, at least, by referring to the categories of terrorist content listed in Article 2(5);
2019/02/08
Committee: CULT
Amendment 272 #
Proposal for a regulation
Article 4 – paragraph 3 – point f
(f) information about redress and associated deadlines available to the hosting service provider and to the content provider;
2019/02/08
Committee: CULT
Amendment 284 #
Proposal for a regulation
Article 4 – paragraph 7 a (new)
7 a. If the hosting service provider is an SME and cannot comply with the removal order because of logistical impossibility due to its size and capacities, it shall inform, without undue delay, the competent authority, explaining the reasons, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the reasons invoked are no longer present.
2019/02/08
Committee: CULT
Amendment 289 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider cannot comply with the removal order because the removal order contains manifest errors or does not contain sufficient technical information to execute the order, it shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the clarification is provided.
2019/02/08
Committee: CULT
Amendment 291 #
Proposal for a regulation
Article 4 – paragraph 9
9. The competent authority which issued the removal order shall inform the competent authority which oversees the implementation of proactive measures, referred to in Article 17(1)(c) when the removal order becomes final. A removal order becomes final where it has not been appealed within the deadline according to the applicable national law or where it has been confirmed following an appeal.
2019/02/08
Committee: CULT
Amendment 298 #
Proposal for a regulation
Article 5 – paragraph 4
4. The referral shall contain sufficiently detailed information, including a comprehensive list of the reasons why the content is considered terrorist content, a URL and, where necessary, additional information enabling the identification of the terrorist content referred.
2019/02/08
Committee: CULT
Amendment 304 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shall, where appropriate, take proactive measures to protect their services against the dissemination of terrorist content, without prejudice to Directive 2000/31/EC and Directive (EU) 2018/1808. The measures shall be effective and proportionate, taking into account the risk and level of exposure to terrorist content, the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 307 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shall, where appropriate, take proactive measures to protect their services against the dissemination of terrorist content. The measures shall be effective, targeted and proportionate, taking into account the risk and level of exposure to terrorist content, and strike a balance with the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 310 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point b
(b) detecting, identifying and expeditiously removing or disabling access to terrorist content comprising, in part or whole, a terrorist content that was subject to a definitive removal order.
2019/02/08
Committee: CULT
Amendment 313 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactive measures taken and reported under paragraph 2 are disproportionate or are insufficient in mitigating and managing the risk and level of exposure, it may request the hosting service provider to adapt the measures already taken or to take specific additional proactive measures. For that purpose, the hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to identifying the changes or specific measures that the hosting service provider shall put in place, establishing key objectives and benchmarks as well as timelines for their implementation.
2019/02/08
Committee: CULT
Amendment 316 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within the three months from the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall take into account, in particular, the type of content hosted on the service, the technical feasibility of the measures, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
2019/02/08
Committee: CULT
Amendment 321 #
Proposal for a regulation
Article 7 – paragraph 1 – point b a (new)
(b a) the treatment of complaints issued in accordance with Article 10.
2019/02/08
Committee: CULT
Amendment 323 #
Proposal for a regulation
Article 7 – paragraph 2
2. The terrorist content and related data referred to in paragraph 1 shall be preserved for six months. The terrorist content shall, upon request from the competent authority or court, be preserved only for a longer period when and for as long as necessary for ongoing proceedings of administrative or judicial review referred to in paragraph 1(a).
2019/02/08
Committee: CULT
Amendment 327 #
Proposal for a regulation
Article 8 – paragraph 1
1. Hosting service providers shall set out in their terms and conditions their policy to collaborate with the competent judicial or independent administrative authorities and to prevent the dissemination of terrorist content, including, where appropriate, a meaningful explanation of the functioning of proactive measures including the use of automated tools.
2019/02/08
Committee: CULT
Amendment 330 #
Proposal for a regulation
Article 8 – paragraph 2
2. Hosting service providers and competent authorities and Union bodies shall publish annual transparency reports on action taken against the dissemination of terrorist content.
2019/02/08
Committee: CULT
Amendment 333 #
Proposal for a regulation
Article 8 – paragraph 3 – introductory part
3. Transparency reports of the hosting service providers shall include at least the following information:
2019/02/08
Committee: CULT
Amendment 339 #
Proposal for a regulation
Article 8 – paragraph 3 – point b
(b) detailed information about the hosting service provider’s measures to prevent the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;
2019/02/08
Committee: CULT
Amendment 341 #
Proposal for a regulation
Article 8 – paragraph 3 – point c
(c) number of pieces of illegal terrorist content removed or to which access has been disabled, following removal orders, referrals, or proactive measures, respectively;
2019/02/08
Committee: CULT
Amendment 351 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verifications of the appropriateness of the decision to remove or deny access to content, in particular with regard to the right to freedom of expression and information. Human oversight shall be required where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content.
2019/02/08
Committee: CULT
Amendment 357 #
Proposal for a regulation
Article 10 – paragraph 1
1. Without prejudice to the remedies, including judicial, available to content providers under national law, hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a substantiated complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 359 #
Proposal for a regulation
Article 10 – paragraph 1
1. The relevant Union bodies and competent authorities shall, in co-operation with hosting service providers, establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 370 #
Proposal for a regulation
Article 11 – paragraph 2
2. Upon request of the content provider, the hosting service provider shall inform the content provider about the reasons for the removal or disabling of access and possibilities to contest the decision.
2019/02/08
Committee: CULT
Amendment 379 #
Proposal for a regulation
Article 13 – paragraph 4
4. Where hosting service providers become aware of any evidence of terrorist offences, they shall promptly inform the authority competent for the investigation and prosecution of criminal offences in the concerned Member State or the point of contact in the Member State pursuant to Article 14(2), where they have their main establishment or a legal representative. Hosting service providers may, in case of doubt, transmit this information to Europol for appropriate follow up.
2019/02/08
Committee: CULT
Amendment 387 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Each Member State shall designate one authority competent to
2019/02/08
Committee: CULT
Amendment 391 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) oversee the application of proactive measures pursuant to Article 6;
2019/02/08
Committee: CULT
Amendment 394 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1 a. Member States shall ensure that ‘competent authority’ means a national or European body with the power to issue, enforce and amend binding legal orders in their relevant jurisdictions.
2019/02/08
Committee: CULT
Amendment 398 #
Proposal for a regulation
Article 18 – paragraph 1 – point d
(d) Article 6(2) and (4) (reports on proactive measures and the adoption of such following a decision imposing specific proactive measures);
2019/02/08
Committee: CULT
Amendment 399 #
Proposal for a regulation
Article 18 – paragraph 1 – point g
(g) Article 9 (safeguards regarding the use and implementation of proactive measures);
2019/02/08
Committee: CULT
Amendment 400 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. Member States shall ensure that, when determining the type and level of penalties, which should not be seen as criminal, the competent authorities take into account all relevant circumstances, including:
2019/02/08
Committee: CULT
Amendment 402 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. Member States shall ensure that, when determining the type and level of penalties, the competent authorities take into account all relevant circumstances, in particular in the case of SMEs, including:
2019/02/08
Committee: CULT
Amendment 408 #
Proposal for a regulation
Article 19 – paragraph 1
1. The Commission shall be empowered to adopt delegated acts in accordance with Article 20 in order to supplement this Regulation with the necessary technical requirements for the electronic means to be used by competent authorities for the transmission of removal orders.
2019/02/08
Committee: CULT
Amendment 409 #
Proposal for a regulation
Article 19 – paragraph 2
2. The Commission shall be empowered to adopt such delegated acts to amend Annexes I, II and III in order to competently address a possible need for improvements regarding the content of removal order forms and of forms to be used to provide information on the impossibility to execute the removal order.
2019/02/08
Committee: CULT
Amendment 411 #
Proposal for a regulation
Article 20 – paragraph 2
2. The power to adopt delegated acts referred to in Article 19 shall be conferred on the Commission for 3 years from [date of application of this Regulation].
2019/02/08
Committee: CULT
Amendment 414 #
Proposal for a regulation
Article 21 – paragraph 1 – point b
(b) information about the specific proactive measures taken pursuant to Article 6, including the amount of illegal terrorist content which has been removed or access to it disabled and the corresponding timeframes;
2019/02/08
Committee: CULT