50 Amendments of Helga TRÜPEL related to 2018/0331(COD)

Amendment 66 #
Proposal for a regulation
Recital 5
(5) The application of this Regulation should not affect the application of Article 14 of Directive 2000/31/EC8 . In particular, any measures taken by the hosting service provider in compliance with this Regulation, including any proactive measures, should not in themselves lead to that service provider losing the benefit of the liability exemption provided for in that provision, recalling that Article 14 requires service providers to act expeditiously to remove or to disable access to illegal content upon receiving knowledge of illegal activity or information. This Regulation leaves unaffected the powers of national authorities and courts to establish liability of hosting service providers in specific cases where the conditions under Article 14 of Directive 2000/31/EC for liability exemption are not met. _________________ 8 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2019/02/08
Committee: CULT
Amendment 70 #
Proposal for a regulation
Recital 6 a (new)
(6 a) The obligations laid down in this Regulation should not affect the duty and ability of national authorities and courts to take appropriate, reasonable and proportionate actions against criminal offences in accordance with national law.
2019/02/08
Committee: CULT
Amendment 71 #
Proposal for a regulation
Recital 7
(7) This Regulation contributes to the protection of public security while establishing appropriate and robust safeguards to ensure protection of the fundamental rights at stake. This includes the rights to respect for private life and to the protection of personal data, the right to effective judicial protection, the right to freedom of expression, including the freedom to receive and impart information, the freedom to conduct a business, and the principle of non-discrimination. Competent authorities and hosting service providers should only adopt measures which are necessary, appropriate and proportionate within a democratic society, taking into account the particular importance accorded to the freedom of expression and information, as well as the freedom of the press and pluralism of the media, which constitutes one of the essential foundations of a pluralist, democratic society, and is one of the values on which the Union is founded. Measures potentially impacting on the freedom of expression and information should be strictly targeted, in the sense that they must and can only be based on a judicial order, in the sense that they must relate to specific content and serve to prevent the dissemination of terrorist content, but without thereby affecting the right to lawfully receive and impart information, taking into account the central role of hosting service providers in facilitating public debate and the distribution and receipt of facts, opinions and ideas in accordance with the law. To underline the important role that professionals participating in the preparation, production and dissemination of press or media content have for the information and opinion forming of the public, these persons need special safeguards to ensure that their work is not jeopardised by decisions to remove terrorist content or disable access to it.
2019/02/08
Committee: CULT
Amendment 82 #
Proposal for a regulation
Recital 9
(9) In order to provide clarity about the actions that both hosting service providers and competent authorities should take to prevent the dissemination of terrorist content online to the general public, this Regulation should establish a definition of terrorist content for preventative purposes in line with the definition of terrorist offences under Directive (EU) 2017/541 of the European Parliament and of the Council9 and should in particular consider links with a recognised terrorist organisation as listed by the EU or UN on designated lists. Given the need to address the most harmful terrorist propaganda online, the definition should capture material and information that incites, encourages or advocates the commission of, or contribution, including financial or logistical, to, terrorist offences, provides instructions for the commission of such offences or promotes the participation in, or dissemination of content related to, activities of an EU or UN listed terrorist group. Such information includes in particular text, images, sound recordings and videos. When assessing whether content constitutes terrorist content within the meaning of this Regulation, competent authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which the statements were made and their potential to lead to harmful consequences, thereby affecting the security and safety of persons. The fact that the material was produced by, is attributable to or disseminated on behalf of an EU-listed terrorist organisation or person constitutes an important factor in the assessment. Content disseminated for educational, research, journalistic and other editorial purposes should not, however, be considered as terrorist content, and should therefore be excluded from the scope of this Regulation, provided that it does not incite the commission of violence, in order to ensure a fair balance between fundamental rights, including in particular the freedom of expression and information, and public security needs. This is necessary in order to take into account the journalistic standards established by press or media regulation. To underline the important role that professionals participating in the preparation, production and dissemination of press or media content have for the information and opinion forming of the public, these persons need special protection to ensure that their work is not jeopardised by decisions to remove terrorist content or disable access to it. Furthermore, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content, provided that it does not incite the commission of violence. _________________ 9 Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA (OJ L 88, 31.3.2017, p. 6).
2019/02/08
Committee: CULT
Amendment 90 #
Proposal for a regulation
Recital 9 a (new)
(9 a) Where the disseminated material is published under the editorial responsibility of a content provider, any decision as to the removal of such content can only be made based on a judicial order. This is necessary in order to fully respect the law of the Union and the right to freedom of expression and the right to freedom and pluralism of the media as enshrined in Article 11 of the Charter of Fundamental Rights.
2019/02/08
Committee: CULT
Amendment 91 #
Proposal for a regulation
Recital 10
(10) In order to cover those online hosting services where there is widespread dissemination of terrorist content, this Regulation should apply to information society services which store information and material provided by a recipient of the service at his or her request and make such information and material stored available to multiple end users of the hosting service provider or to the general public. This Regulation applies to the activity of providing hosting services, rather than to the specific provider or its dominant activity, which might combine hosting services with other services that are not in the scope of this Regulation. Storing information for the purposes of this Regulation consists of holding data in the memory of a physical or virtual server; this excludes mere conduits and other electronic communication services within the meaning of [European Electronic Communication Code], providers of caching services, other services provided in other layers of the cloud IT infrastructure, such as VPS (Virtual Private Servers), bare metal servers, containers, registries and registrars, DNS (domain name system) or adjacent services, payment services, or DDoS (distributed denial of service) protection services. Further, the information has to be stored at the request of the online content provider; only those online hosting services for which the online content provider is the direct recipient are in scope. Finally, the information stored must be made available to the general public or to multiple end users of the hosting service provider, understood as any end user of the hosting service provider who is not the online content provider. Interpersonal communication services that enable direct interpersonal and interactive exchange of information between a finite number of persons, whereby the persons initiating or participating in the communication determine its recipient(s), are not in scope. By way of example, such hosting service providers include social media platforms, video-sharing platforms, video streaming services, video, image and audio sharing services, public file sharing services, and other cloud and storage services (excluding cloud IT infrastructure), to the extent they make the information or material directly available to multiple end users of the hosting service provider or to the general public. The Regulation should also apply to hosting service providers established outside the Union but targeting services within the Union, since a significant proportion of hosting service providers exposed to terrorist content on their services are established in third countries. This should ensure that all companies operating in the Digital Single Market comply with the same requirements, irrespective of their country of establishment. The determination as to whether a service provider offers services in the Union requires an assessment of whether the service provider enables legal or natural persons in one or more Member States to use its services.
However, the mere accessibility of a service provider’s website or of an email address and of other contact details in one or more Member States taken in isolation should not be a sufficient condition for the application of this Regulation. By contrast, web hosting service providers that provide the technical infrastructure to website operators, including in order to allow the deployment and functioning of the website, are not covered by this Regulation.
2019/02/08
Committee: CULT
Amendment 98 #
Proposal for a regulation
Recital 12
(12) Hosting service providers should apply certain duties of care in order to prevent and deter the dissemination of terrorist content on their services. These duties of care should not amount to a general monitoring obligation. Duties of care should include that, when applying this Regulation, hosting service providers act in a diligent, proportionate and non-discriminatory manner in respect of content that they store, in particular when implementing their own terms and conditions, with a view to avoiding removal of content which is not terrorist. The removal or disabling of access has to be undertaken in observance of and respect for freedom of expression and information. Effective and expeditious complaints and redress mechanisms should be made available by the hosting service providers in cases of unjustified removal of content.
2019/02/08
Committee: CULT
Amendment 102 #
Proposal for a regulation
Recital 13
(13) The procedure and obligations resulting from legal orders requesting hosting service providers to remove terrorist content or disable access to it, following an assessment by the competent authorities, should be harmonised. Member States should remain free as to the choice of the competent authorities, allowing them to designate administrative, law enforcement or judicial authorities with that task. Given the speed at which terrorist content is disseminated across online services, this provision imposes obligations on hosting service providers to ensure that terrorist content identified in the removal order is removed or access to it is disabled within one hour from receiving the removal order. In case of delays, the nature and size of the hosting service providers should be taken into account, particularly in the case of microenterprises or small-sized enterprises. Given the disproportionately high level of harm that terrorist content can cause to the public or to the public order of a Member State, because of its high level of violence or its link to an on-going or very recent terrorist offence committed in the Member State concerned, Member States should be allowed in these cases to impose obligations on hosting service providers to ensure that the terrorist content identified in the duly justified removal order is removed or access to it is disabled immediately upon receipt of the removal order. It is for the hosting service providers to decide whether to remove the content in question or disable access to the content for users in the Union.
2019/02/08
Committee: CULT
Amendment 122 #
Proposal for a regulation
Recital 16
(16) Given the scale and speed necessary for effectively identifying and removing terrorist content, proportionate proactive measures, including by using automated means in certain cases, are an essential element in tackling terrorist content online. With a view to reducing the accessibility of terrorist content on their services, hosting service providers should assess whether it is appropriate, effective and proportionate to take targeted proactive measures, depending on the risks and level of exposure to terrorist content as well as on the effects on the rights of third parties and the public interest in information. Consequently, hosting service providers should determine what appropriate, effective and proportionate proactive measures should be put in place. This requirement should not imply a general monitoring obligation. In the context of this assessment, the absence of removal orders and referrals addressed to a hosting service provider is an indication of a low level of exposure to terrorist content.
2019/02/08
Committee: CULT
Amendment 126 #
Proposal for a regulation
Recital 17
(17) When putting in place proactive measures, hosting service providers should ensure that users’ right to freedom of expression and information - including to freely receive and impart information - is preserved. In addition to any requirement laid down in the law, including the legislation on protection of personal data, hosting service providers should act with due diligence and implement safeguards, including notably human oversight and verifications, where appropriate, to avoid any unintended and erroneous decision leading to removal of content that is not terrorist content. This is of particular relevance when hosting service providers use automated means to detect terrorist content. Any decision to use automated means, whether taken by the hosting service provider itself or pursuant to a request by the competent authority, should be assessed with regard to the reliability of the underlying technology and the ensuing impact on fundamental rights. Hosting service providers should put in place effective and expeditious complaints and redress mechanisms to address cases of unjustified removals of content.
2019/02/08
Committee: CULT
Amendment 129 #
Proposal for a regulation
Recital 18
(18) In order to ensure that hosting service providers exposed to terrorist content take appropriate measures to prevent the misuse of their services, the competent authorities should request hosting service providers having received a removal order, which has become final, to report on the proactive measures taken, as well as on the functioning of the complaints and redress mechanisms. These could consist of measures to prevent the re-upload of terrorist content which has been removed or to which access has been disabled as a result of a removal order or referrals they received, by checking against publicly or privately held tools containing known terrorist content. They may also employ reliable technical tools to identify new terrorist content, for instance where it uses in part or in whole terrorist content that is already subject to a definitive removal order or where it is uploaded by users who have already uploaded terrorist content, either using tools available on the market or those developed by the hosting service provider. The service provider should report on the specific proactive measures in place in order to allow the competent authority to judge whether the measures are effective and proportionate and whether, if automated means are used, the hosting service provider has the necessary abilities for human oversight and verification. In assessing the effectiveness and proportionality of the measures, competent authorities should take into account relevant parameters, including the number of removal orders and referrals issued to the provider, its economic capacity and the impact of its service in disseminating terrorist content (for example, taking into account the number of users in the Union).
2019/02/08
Committee: CULT
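The re-upload prevention described in Recital 18 above is, in practice, usually implemented by matching incoming uploads against digital fingerprints of previously removed material. The following is a minimal illustrative sketch of such hash matching, not a prescribed implementation; all names are hypothetical, and the Recital deliberately leaves the choice of tools open.

```python
# Minimal sketch of hash-based re-upload prevention (hypothetical names).
# Recital 18 speaks of "checking against publicly or privately held tools
# containing known terrorist content"; one simple realisation is a set of
# cryptographic hashes of content removed under a definitive removal order.
import hashlib

known_hashes: set[str] = set()  # hypothetical store of fingerprints

def register_removed_content(data: bytes) -> None:
    """Record the fingerprint of content removed under a final removal order."""
    known_hashes.add(hashlib.sha256(data).hexdigest())

def is_known_reupload(data: bytes) -> bool:
    """Check an incoming upload against previously removed content.

    A match is only an indication: under Recital 17 and Article 9 as amended,
    a removal decision still requires human oversight and verification.
    """
    return hashlib.sha256(data).hexdigest() in known_hashes
```

Note that exact cryptographic hashes only catch byte-identical files; tools deployed in practice typically rely on perceptual hashing so that trivially altered re-uploads still match.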
Amendment 136 #
Proposal for a regulation
Recital 19
(19) Following the request, the competent authority should enter into a dialogue with the hosting service provider about the necessary proactive measures to be put in place. If necessary, the competent authority should impose the adoption of appropriate, effective and proportionate proactive measures where it considers that the measures taken are insufficient to meet the risks. A decision to impose such specific proactive measures should not, in principle, lead to the imposition of a general obligation to monitor, as provided in Article 15(1) of Directive 2000/31/EC. Considering the particularly grave risks associated with the dissemination of terrorist content, the decisions adopted by the competent authorities on the basis of this Regulation could derogate from the approach established in Article 15(1) of Directive 2000/31/EC, only as regards certain specific, targeted measures, the adoption of which is necessary for overriding public security reasons. Before adopting such decisions, the competent authority should strike a fair balance between the public interest objectives and the fundamental rights involved, in particular, the freedom of expression and information and the freedom to conduct a business, and provide appropriate justification.
2019/02/08
Committee: CULT
Amendment 161 #
Proposal for a regulation
Recital 37
(37) For the purposes of this Regulation, Member States should designate competent authorities, including judicial authorities, with the relevant expertise. The requirement to designate competent authorities does not necessarily require the establishment of new authorities; existing bodies can be tasked with the functions set out in this Regulation. This Regulation requires designating authorities competent for issuing removal orders and referrals, for overseeing proactive measures and for imposing penalties. It is for Member States to decide how many authorities they wish to designate for these tasks. Member States should notify the European Commission of the competent authorities designated for the purpose of this Regulation.
2019/02/08
Committee: CULT
Amendment 174 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down uniform rules to prevent the misuse of hosting services for the dissemination to the general public of terrorist content online. It lays down in particular:
2019/02/08
Committee: CULT
Amendment 179 #
Proposal for a regulation
Article 1 – paragraph 1 – point a
(a) rules on duties of care to be applied by providers of services covered by this Regulation in order to prevent the dissemination of terrorist content to the general public through their services and ensure, where necessary, its swift removal;
2019/02/08
Committee: CULT
Amendment 185 #
Proposal for a regulation
Article 1 – paragraph 1 – point b
(b) a set of measures to be put in place by Member States to identify terrorist content, to enable its swift removal by hosting service providers and to facilitate cooperation with the relevant competent authorities in other Member States, hosting service providers and where appropriate relevant Union bodies.
2019/02/08
Committee: CULT
Amendment 188 #
Proposal for a regulation
Article 1 – paragraph 2
2. This Regulation shall apply to providers of services, as defined in this Regulation, offering services targeted to the general public in the Union, irrespective of their place of main establishment.
2019/02/08
Committee: CULT
Amendment 191 #
Proposal for a regulation
Article 1 a (new)
Article 1 a This Regulation must be applied in full respect of the fundamental rights enshrined in Article 6 of the Treaty on European Union.
2019/02/08
Committee: CULT
Amendment 192 #
Proposal for a regulation
Article 2 – paragraph 1 – point 1
(1) 'hosting service provider' means a provider of information society services consisting in the storage of information provided by and at the request of the online content provider and in making the information stored available to the general public, excluding web hosting service providers that provide the technical infrastructure to website operators. For the purposes of this Regulation, the following services are excluded from the scope: mere conduits, electronic communication services, caching services, cloud IT infrastructure services, registries, registrars, DNS, payment services, DDoS protection services and interpersonal communication services. 'Cloud infrastructure services', which consist in the provision of on demand physical or virtual resources that provide computing and storage infrastructure capabilities on which the service provider has no contractual rights as to what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by its customers or the end-users of its customers, shall not be considered hosting services within the meaning and for the purposes of this Regulation.
2019/02/08
Committee: CULT
Amendment 197 #
Proposal for a regulation
Article 2 – paragraph 1 – point 2
(2) 'content provider' means a user of the services of a hosting service provider who has provided content to such hosting service provider for the purpose of disseminating such content to the general public;
2019/02/08
Committee: CULT
Amendment 202 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – introductory part
(5) 'terrorist content' means any publicly available information or material, other than material used for educational, research, journalistic and other editorial purposes provided that it does not incite the commission of violence, which is connected to a terrorist organisation as listed by the EU or UN on designated lists, and which may contribute to the commission of intentional acts, which constitute offences under national law, as listed in Article 3(1)(a) to (i) of Directive (EU) 2017/541, by:
2019/02/08
Committee: CULT
Amendment 220 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point c
(c) promoting the activities of a terrorist group, in particular by encouraging the participation in, meeting with, communicating with or supporting a terrorist group within the meaning of Article 2(3) of Directive (EU) 2017/541, or by encouraging the dissemination of terrorist content;
2019/02/08
Committee: CULT
Amendment 229 #
Proposal for a regulation
Article 2 – paragraph 1 – point 5 – point d a (new)
(d a) the expression of radical, polemic or controversial views in the public debate on sensitive political questions shall however not be considered terrorist content.
2019/02/08
Committee: CULT
Amendment 235 #
Proposal for a regulation
Article 2 – paragraph 1 – point 6
(6) ‘dissemination of terrorist content’ means making terrorist content publicly available to third parties on the hosting service providers’ services;
2019/02/08
Committee: CULT
Amendment 242 #
Proposal for a regulation
Article 2 – paragraph 1 – point 9 a (new)
(9 a) ‘competent authority’ means a body, including judicial, with the relevant expertise designated or created by the Member State for the purpose of this Regulation.
2019/02/08
Committee: CULT
Amendment 245 #
Proposal for a regulation
Article 3 – paragraph 1
1. Hosting service providers shall take appropriate, reasonable and proportionate actions in accordance with this Regulation against the public dissemination of terrorist content and to protect users from terrorist content. In doing so, they shall act in a diligent, proportionate and non-discriminatory manner and with due regard to striking a balance with the fundamental rights of the users, such as the protection of private life, data protection and secrecy of correspondence, and take into account the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 248 #
Proposal for a regulation
Article 3 – paragraph 2
2. Hosting service providers shall include in their terms and conditions, and apply, effective and proportionate provisions to prevent the storing and dissemination of terrorist content on their services.
2019/02/08
Committee: CULT
Amendment 252 #
Proposal for a regulation
Article 4 – paragraph 1
1. The competent authority shall have the power to issue a removal order requiring the hosting service provider to remove terrorist content or disable access to it. If material is published under the editorial responsibility of a content provider, any removal order can only become effective based on a judicial order.
2019/02/08
Committee: CULT
Amendment 256 #
Proposal for a regulation
Article 4 – paragraph 2
2. Hosting service providers shall remove terrorist content or disable access to it within one hour from receipt of the removal order. Member States may provide that, where terrorist content is manifestly harmful or constitutes an immediate threat to public order, hosting service providers shall remove or disable access to the terrorist content from the moment of receipt of a duly justified removal order.
2019/02/08
Committee: CULT
Amendment 268 #
Proposal for a regulation
Article 4 – paragraph 3 – point c
(c) an exact online Uniform Resource Locator (URL) and, where necessary, additional identification of the online content provider and any other information enabling the identification of the content referred;
2019/02/08
Committee: CULT
Amendment 273 #
Proposal for a regulation
Article 4 – paragraph 3 – point f
(f) information about redress and associated deadlines available to the hosting service provider and to the content provider;
2019/02/08
Committee: CULT
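Amendments 268 and 273 above amend two of the fields a removal order must contain under Article 4(3). Purely as an illustration of the resulting data structure, the sketch below models points (c) and (f); the field names are hypothetical and the remaining points of Article 4(3) are not modelled.

```python
# Illustrative sketch only: a minimal data model for the removal-order
# fields as amended (Article 4(3), points (c) and (f)). Field names are
# hypothetical; the other points of Article 4(3) are omitted.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RemovalOrder:
    # Point (c): an exact online URL and, where necessary, additional
    # identification of the online content provider and of the content.
    url: str
    content_provider_id: Optional[str] = None
    additional_content_info: Optional[str] = None
    # Point (f): information about redress available to the hosting service
    # provider and to the content provider, and the associated deadlines.
    redress_information: str = ""
    redress_deadline: Optional[str] = None  # e.g. an ISO 8601 date
```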
Amendment 282 #
Proposal for a regulation
Article 4 – paragraph 7
7. If the hosting service provider cannot comply with the removal order because of force majeure, or of logistical impossibility due to its size and capacities, or of de facto impossibility or impracticability, it shall inform, without undue delay, the competent authority, explaining the reasons, using the template set out in Annex III. Paragraph 2 shall apply as soon as the reasons invoked are no longer present, except when the hosting service provider cannot comply with the removal order because of technical impracticability or because it would have disproportionate effects on the service or its users or on the rights of the users, such as the protection of private life, data protection and secrecy of correspondence, the freedom of expression and information in an open and democratic society and the freedom to conduct a business.
2019/02/08
Committee: CULT
Amendment 288 #
Proposal for a regulation
Article 4 – paragraph 8
8. If the hosting service provider cannot comply with the removal order because the removal order contains manifest errors or does not contain sufficient technical information to execute the order, it shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 2 shall apply as soon as the clarification is provided.
2019/02/08
Committee: CULT
Amendment 295 #
Proposal for a regulation
Article 4 a (new)
Article 4 a 1. The issuing authority shall submit a copy of the removal order to the competent authority referred to in Article 17(1)(a) of the Member State in which the main establishment of the hosting service provider is located at the same time as it is transmitted to the hosting service provider in accordance with Article 4(5). 2. In cases where the competent authority of the Member State in which the main establishment of the hosting service provider is located has reasonable grounds to believe that the removal order may impact fundamental interests and fundamental rights, it shall inform the issuing competent authority. 3. The issuing competent authority shall take these circumstances into account and shall, where necessary, withdraw or adapt the removal order.
2019/02/08
Committee: CULT
Amendment 306 #
Proposal for a regulation
Article 6 – paragraph 1
1. Hosting service providers shall, where appropriate, take proactive measures to protect their services against the dissemination of terrorist content. The measures shall be effective, targeted and proportionate, taking into account the risk and level of exposure to terrorist content, and strike a balance with the fundamental rights of the users, and the fundamental importance of the freedom of expression and information in an open and democratic society.
2019/02/08
Committee: CULT
Amendment 309 #
Proposal for a regulation
Article 6 – paragraph 2 – subparagraph 1 – point a
(a) effectively preventing the re-upload of content which has previously been removed or to which access has been disabled because it is considered to be terrorist content;
2019/02/08
Committee: CULT
Amendment 312 #
Proposal for a regulation
Article 6 – paragraph 3
3. Where the competent authority referred to in Article 17(1)(c) considers that the proactive measures taken and reported under paragraph 2 are disproportionate or are insufficient in mitigating and managing the risk and level of exposure, it may request the hosting service provider to adapt the measures already taken or to take specific additional proactive measures. For that purpose, the hosting service provider shall cooperate with the competent authority referred to in Article 17(1)(c) with a view to identifying the changes or specific measures that the hosting service provider shall put in place, establishing key objectives and benchmarks as well as timelines for their implementation.
2019/02/08
Committee: CULT
Amendment 315 #
Proposal for a regulation
Article 6 – paragraph 4
4. Where no agreement can be reached within three months of the request pursuant to paragraph 3, the competent authority referred to in Article 17(1)(c) may issue a decision imposing specific additional necessary and proportionate proactive measures. The decision shall take into account, in particular, the type of content hosted on the service, the technical feasibility of the measures, the economic capacity of the hosting service provider and the effect of such measures on the fundamental rights of the users and the fundamental importance of the freedom of expression and information. Such a decision shall be sent to the main establishment of the hosting service provider or to the legal representative designated by the service provider. The hosting service provider shall regularly report on the implementation of such measures as specified by the competent authority referred to in Article 17(1)(c).
2019/02/08
Committee: CULT
Amendment 317 #
Proposal for a regulation
Article 6 – paragraph 5 a (new)
5 a. Article 6 and Article 9 shall not apply to providers of cloud infrastructure services which consist in the provision of on demand physical or virtual resources that provide computing and storage infrastructure capabilities on which the service provider has no rights as to what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no specific control of the content stored by their customers or the end-users of their customers.
2019/02/08
Committee: CULT
Amendment 320 #
Proposal for a regulation
Article 7 – paragraph 1 – point b a (new)
(b a) the treatment of complaints issued in accordance with Article 10.
2019/02/08
Committee: CULT
Amendment 332 #
Proposal for a regulation
Article 8 – paragraph 2
2. Hosting service providers shall publish annual transparency reports on action taken against the dissemination of terrorist content to the general public.
2019/02/08
Committee: CULT
Amendment 344 #
Proposal for a regulation
Article 8 – paragraph 3 – point d
(d) assessment of the effectiveness of the complaint and redress mechanisms.
2019/02/08
Committee: CULT
Amendment 349 #
Proposal for a regulation
Article 9 – paragraph 1
1. Where hosting service providers use automated tools pursuant to this Regulation in respect of content that they store, they shall provide effective and appropriate safeguards to ensure that decisions taken concerning that content, in particular decisions to remove or disable access to content considered to be terrorist content, are accurate and well-founded.
2019/02/08
Committee: CULT
Amendment 352 #
Proposal for a regulation
Article 9 – paragraph 2
2. Safeguards shall consist, in particular, of human oversight and verification of the appropriateness of the decision to remove or deny access to content, in particular with regard to the right to freedom of expression and information. Human oversight shall be required where a detailed assessment of the relevant context is required in order to determine whether or not the content is to be considered terrorist content.
2019/02/08
Committee: CULT
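Amendment 352 above makes human verification of the appropriateness of every removal decision the core safeguard. As a rough illustration of what that routing could look like on the provider's side, the sketch below queues every automated detection for human review instead of acting on it directly; all names are hypothetical.

```python
# Illustrative sketch only (hypothetical names): automated flags never
# trigger removal by themselves; a human verifies the appropriateness of
# each decision, as Article 9(2) as amended would require.
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    pending: list[str] = field(default_factory=list)  # content ids awaiting review

def on_automated_flag(content_id: str, queue: ReviewQueue) -> None:
    """Route an automated detection to human verification, never to removal."""
    queue.pending.append(content_id)

def on_human_decision(content_id: str, remove: bool) -> str:
    """Only a verified human decision results in removal or retention."""
    return ("removed" if remove else "retained") + ":" + content_id
```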
Amendment 356 #
Proposal for a regulation
Article 10 – title
Complaint and redress mechanisms
2019/02/08
Committee: CULT
Amendment 361 #
Proposal for a regulation
Article 10 – paragraph 1
1. Hosting service providers shall establish expeditious, effective and accessible complaints and redress mechanisms allowing content providers whose content has been removed or access to it disabled as a result of a referral pursuant to Article 5 or of proactive measures pursuant to Article 6, to submit a substantiated complaint against the action of the hosting service provider requesting reinstatement of the content.
2019/02/08
Committee: CULT
Amendment 367 #
Proposal for a regulation
Article 11 – paragraph 1
1. Where hosting service providers remove terrorist content or disable access to it, they shall make available to the content provider information on the removal or disabling of access to terrorist content within 24 hours.
2019/02/08
Committee: CULT
Amendment 372 #
Proposal for a regulation
Article 12 a (new)
Article 12 a In cases where content has been removed or access to it disabled as a result of a removal order pursuant to Article 4, a referral pursuant to Article 5 or proactive measures pursuant to Article 6, the content provider concerned can initiate judicial proceedings at any time requesting reinstatement of the content. Initiation of judicial proceedings is not conditional on the initiation of the complaint mechanisms referred to in Article 10.
2019/02/08
Committee: CULT
Amendment 401 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. Member States shall ensure that, when determining the type and level of penalties, the competent authorities take into account all relevant circumstances, in particular in the case of SMEs, including:
2019/02/08
Committee: CULT
Amendment 405 #
Proposal for a regulation
Article 18 – paragraph 3 – point e a (new)
(e a) the nature and size of the hosting service providers, in particular microenterprises or small-sized enterprises within the meaning of Commission Recommendation 2003/361/EC.
2019/02/08
Committee: CULT