
23 Amendments of Birgit SIPPEL related to 2020/2022(INI)

Amendment 1 #
Motion for a resolution
Citation 3
— having regard to the Charter of Fundamental Rights of the European Union, in particular Article 6, Article 7, Article 8, Article 11, Article 13, Article 21, Article 22, Article 23, Article 24, Article 25 and Article 26 thereof,
2020/06/24
Committee: LIBE
Amendment 4 #
Motion for a resolution
Citation 6 a (new)
— having regard to Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive)3a,
_________________
3a OJ L 95, 15.4.2010, p. 1–24.
2020/06/24
Committee: LIBE
Amendment 16 #
Motion for a resolution
Recital -A (new)
-A. whereas fundamental rights, such as the protection of privacy and personal data, the principle of non-discrimination, as well as freedom of expression and information, need to be ingrained at the core of a successful and durable European policy on digital services; whereas these rights need to be seen both in the letter of the law and in the spirit of its implementation;
2020/06/24
Committee: LIBE
Amendment 17 #
Motion for a resolution
Recital A b (new)
Ab. whereas the trust of users can only be gained by digital services that respect their fundamental rights, thus ensuring both the uptake of services and a competitive advantage and stable business models for companies;
2020/06/24
Committee: LIBE
Amendment 20 #
Motion for a resolution
Recital B a (new)
Ba. whereas the privacy rules in the electronic communication sector, as set out in the Directive concerning the processing of personal data and the protection of privacy in the electronic communications sector, are currently under revision;
2020/06/24
Committee: LIBE
Amendment 25 #
Motion for a resolution
Recital C
C. whereas the amount of all types of user-generated content, including harmful and illegal content, shared via cloud services or online platforms has increased exponentially;
2020/06/24
Committee: LIBE
Amendment 30 #
Motion for a resolution
Recital C d (new)
Cd. whereas the Cambridge Analytica and Facebook scandals revealed how user data had been used to micro-target certain voters with political advertising, and at times even with targeted disinformation, thereby showing the danger of opaque data processing operations by online platforms;
2020/06/24
Committee: LIBE
Amendment 31 #
Motion for a resolution
Recital C e (new)
Ce. whereas the widespread use of algorithms for content filtering and content removal processes also raises rule of law concerns, as well as questions of legality, legitimacy and proportionality;
2020/06/24
Committee: LIBE
Amendment 58 #
Motion for a resolution
Recital J
J. whereas the lack of comparable, robust public data on the prevalence of, and both court-mandated and self-regulatory removal of, illegal and harmful content online creates a deficit of transparency and accountability;
2020/06/24
Committee: LIBE
Amendment 68 #
Motion for a resolution
Recital L
L. whereas, according to the jurisprudence of the Court of Justice of the European Union (CJEU), host providers may have recourse to automated search tools and technologies to assess whether content is equivalent to content previously declared unlawful, as long as this does not result in a general monitoring of the information they store or in actively seeking facts or circumstances indicating illegal activity, as provided for in Article 15(1) of Directive 2000/31/EC; whereas such content should be removed following a court order from a Member State;
2020/06/24
Committee: LIBE
Amendment 81 #
Motion for a resolution
Paragraph 1
1. Stresses that illegal content online is the same as illegal content offline; takes therefore the position that any legally mandated content moderation measure in the Digital Services Act should concern only illegal content, as it is defined in European or national law, and that the legislative text should not include any legally vague and undefined terms, such as "harmful content", as targeting such content would put fundamental rights and freedom of speech at serious risk and put the service providers in a legally unclear position;
2020/06/24
Committee: LIBE
Amendment 95 #
Motion for a resolution
Paragraph 2
2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights, especially data protection, privacy and non-discrimination, are at its core;
2020/06/24
Committee: LIBE
Amendment 104 #
Motion for a resolution
Paragraph 3
3. Deems it necessary that illegal content is removed swiftly and consistently in order to address crimes and fundamental rights violations, through a clear and harmonised notice-and-action procedure with the necessary safeguards in place, such as transparency of the process, the right to appeal and access to effective judicial redress; considers that voluntary codes of conduct only partially address the issue;
2020/06/24
Committee: LIBE
Amendment 122 #
Motion for a resolution
Paragraph 5
5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as it requires contextualisation; reminds in this regard of the incapacity of current automated tools to grasp the importance of context for specific pieces of content; underlines that algorithms are not currently capable of critical analysis and takes therefore the view that the Digital Services Act should not contain any obligation for the compulsory use of automated tools in content moderation; believes that any voluntary automated measures put in place by content hosting platforms should be subject to extensive human oversight and to full transparency of design and performance;
2020/06/24
Committee: LIBE
Amendment 139 #
Motion for a resolution
Paragraph 7 a (new)
7a. Highlights that the practical capacity of individuals to understand and navigate the complexity of the data ecosystems in which they are embedded is extremely limited, as is their ability to identify whether the information they receive and the services they use are made available to them on the same terms as to other users; calls on the Commission therefore to place transparency and non-discrimination at the heart of the Digital Services Act;
2020/06/24
Committee: LIBE
Amendment 148 #
Motion for a resolution
Paragraph 9
9. Calls, to this end, for the digital single market to be kept open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures, with robust procedural safeguards, to remove illegal content via a harmonised notice-and-action procedure in line with European legislation;
2020/06/24
Committee: LIBE
Amendment 155 #
Motion for a resolution
Paragraph 10
10. Believes, in this regard, that online platforms that are actively hosting or moderating content should bear more, yet proportionate, responsibility for the infrastructure they provide and the content on it; emphasises that it is crucial for online platforms to have clarity, provided for by setting clear rules, requirements and safeguards for a harmonised notice-and-action procedure; emphasises that any measure put in place for the removal of illegal content cannot constitute or imply a general monitoring requirement;
2020/06/24
Committee: LIBE
Amendment 167 #
Motion for a resolution
Paragraph 12
12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter-notice procedures, to ensure that removal or blocking decisions are legal, well-founded and respect fundamental rights; recalls that, while counter-notice procedures, complaint mechanisms and out-of-court dispute settlements can be valuable tools in protecting the fundamental rights of the users of digital services, they cannot preclude access to effective judicial redress and remedy;
2020/06/24
Committee: LIBE
Amendment 192 #
Motion for a resolution
Paragraph 14
14. Believes that the terms of service of digital service providers should be clear, transparent and fair; recalls that any take-down notice from an authority always has to be based on law, not on the terms of service of the service providers;
2020/06/24
Committee: LIBE
Amendment 208 #
Motion for a resolution
Paragraph 15 b (new)
15b. Is concerned about platforms and services that deliberately lock their users into their specific platform, thus amplifying their dominant market power and their ability to profile their users even more thoroughly, creating extremely invasive and revealing profiles of their users; calls therefore on the Commission to guarantee the interoperability of digital services; considers in this regard application programming interfaces (APIs), enabling a user to interconnect between platforms and to import content moderation rules for the content they view on a platform, to be useful tools in bringing true interoperability to users and thus increasing their options to choose between different kinds of recommendation systems and services;
2020/06/24
Committee: LIBE
Amendment 211 #
Motion for a resolution
Paragraph 16
16. Underlines the wedge between the speed and capacity of machines relative to the capacity of humans to monitor these machines; therefore deems that accountability always lies with the human overseers, and calls for evidence-based policy making, requiring robust data on the prevalence and removal of illegal content online, in order to ensure a transparent system that can be trusted by all;
2020/06/24
Committee: LIBE
Amendment 226 #
Motion for a resolution
Paragraph 19
19. Expresses its concern regarding the fragmentation of public oversight and supervision of platforms and the documented lack of financial and human resources for the supervision and oversight bodies needed to properly fulfil their tasks; calls for increased cooperation with regard to regulatory oversight of digital services;
2020/06/24
Committee: LIBE
Amendment 228 #
Motion for a resolution
Paragraph 19 a (new)
19a. Considers that, in order to guarantee proper enforcement of the Digital Services Act, the oversight of compliance with this Act should be entrusted to an independent authority, while any decisions relating to content should always remain with the judiciary; emphasises in this regard that sanctioning for non-compliance with the Digital Services Act should be based on an assessment of a clearly defined set of factors, such as proportionality, technical and organisational measures and negligence, and that the resulting sanctions should be based on a percentage of the annual global turnover of a company;
2020/06/24
Committee: LIBE