Activities of Cornelia ERNST related to 2020/2022(INI)
Shadow reports (1)
REPORT on the Digital Services Act and fundamental rights issues posed
Amendments (40)
Amendment 2 #
Motion for a resolution
Citation 3
— having regard to the Charter of Fundamental Rights of the European Union, in particular Article 6, Article 7, Article 8, Article 11, Article 13, Article 21, Article 22, Article 24 and Article 38 thereof,
Amendment 7 #
Motion for a resolution
Citation 7
Amendment 15 #
Motion for a resolution
Citation 8
— having regard to the relevant case law of the Court of Justice of the European Union,
Amendment 19 #
Motion for a resolution
Recital B
B. whereas the data protection rules applicable to all providers offering digital services in the EU’s territory were recently updated and harmonised across the EU with the General Data Protection Regulation; whereas its enforcement needs to be strengthened;
Amendment 23 #
Motion for a resolution
Recital C
C. whereas the amount of services available and users' activities, including illegal services and activities, shared via cloud services or online platforms has increased exponentially;
Amendment 35 #
Motion for a resolution
Recital D
D. whereas a small number of mostly non-European service providers have significant market power and exert influence over suppliers and control how information, services and products are presented, thereby having an impact on the rights and freedoms of individuals, and our societies and democracies;
Amendment 38 #
Motion for a resolution
Recital E
E. whereas the policy approach to tackle harmful and illegal activities online in the EU has mainly focused on voluntary cooperation thus far, but a growing number of Member States are adopting national legislation to address illegal content;
Amendment 46 #
Motion for a resolution
Recital F
F. whereas some forms of harmful content may be legal, yet detrimental to society or democracy, with examples such as opaque political advertising and disinformation on COVID-19 causes and remedies;
Amendment 51 #
Motion for a resolution
Recital G
G. whereas a pure self-regulatory approach by platforms does not provide adequate transparency to public authorities, civil society and users on how platforms address illegal and harmful activities; whereas such an approach does not guarantee compliance with fundamental rights;
Amendment 55 #
Motion for a resolution
Recital I
I. whereas the absence of uniform and transparent rules for procedural safeguards across the EU is a key obstacle for persons affected by illegal activities online and for content providers, including users, seeking to exercise their rights;
Amendment 60 #
Motion for a resolution
Recital J a (new)
Ja. whereas persons of colour, persons belonging to or who are perceived to belong to ethnic or linguistic minorities, asylum seekers, migrants, LGBTIQ persons and women often experience high levels of discriminatory hate speech, bullying, threats and scapegoating online and run high risks of experiencing so-called "shit storms";
Amendment 61 #
Motion for a resolution
Recital J b (new)
Jb. whereas algorithms used for automated decision-making or profiling often reproduce existing discriminatory patterns in society, thereby leading to a high risk of exacerbated discrimination for persons already affected;
Amendment 63 #
Motion for a resolution
Recital K
Amendment 70 #
Motion for a resolution
Recital L
L. whereas, according to the jurisprudence of the Court of Justice of the European Union (CJEU), host providers may have recourse to automated search tools and technologies to assess whether content is equivalent to content previously declared unlawful and should thus be removed following an order from a Member State, but they are not obliged to use such automated tools;
Amendment 80 #
Motion for a resolution
Paragraph 1
Amendment 84 #
Motion for a resolution
Paragraph 1 a (new)
Amendment 96 #
Motion for a resolution
Paragraph 2
2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights, including data protection, are respected; calls for comprehensive and effective regulatory intervention based on the principles of necessity and proportionality;
Amendment 107 #
Motion for a resolution
Paragraph 3
3. Deems it necessary that illegal activities are removed swiftly and consistently in order to address law infringements and fundamental rights violations; considers that voluntary codes of conduct lack adequate enforcement and have proven to be inefficient in addressing the issue;
Amendment 111 #
Motion for a resolution
Paragraph 4
4. Recalls that illegal information, services and products online should not only be removed by online platforms, but should also be followed up by law enforcement and the judiciary; calls on the Commission to consider obliging major hosting service providers to report serious crime to the competent law enforcement authority upon obtaining actual knowledge of such a crime; calls for barriers to filing complaints with competent authorities to be removed; is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent authorities, as well as cross-border cooperation between national competent authorities, should be improved; stresses in this regard the need to respect the legal order of the EU and the established principles of cross-border cooperation; stresses that competent authorities have to be provided with adequate resources in order to be effective;
Amendment 120 #
Motion for a resolution
Paragraph 5
5. Acknowledges that the decision on the illegal nature of online information, products and services is difficult as it requires contextualisation; warns that automated tools are unable to differentiate illegal content from content that is legal in a given context, which could lead to unnecessary restrictions being placed on the freedom of expression; highlights that a review of automated reports by service providers, their staff or their contractors does not solve this problem, as private staff lack the independence, qualification and accountability of public authorities; stresses, therefore, that the Digital Services Act shall explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation and shall refrain from imposing notice-and-stay-down mechanisms; content moderation procedures used by providers shall not lead to any ex-ante control measures based on automated tools or upload-filtering of content;
Amendment 129 #
Motion for a resolution
Paragraph 6
6. Underlines that a specific piece of information may be deemed illegal in one Member State but is covered by the right to freedom of expression in another; stresses, therefore, that national authorities should only be allowed to address and enforce removal orders to service providers established in their territory;
Amendment 133 #
Motion for a resolution
Paragraph 7
7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by new technologies and ensuring legal clarity, respect for fundamental rights and enhanced consumer protection; considers that the reform should build on the solid foundation of, and full compliance with, existing EU law, especially the General Data Protection Regulation and the Directive on privacy and electronic communications; calls on the Council to swiftly reach a general approach which does not lower current levels of protection for consumers and to start trilogue negotiations with the European Parliament on the proposal for the ePrivacy Regulation as soon as possible;
Amendment 141 #
Motion for a resolution
Paragraph 8
8. Deems it indispensable to have the widest-possible harmonisation and clarification of rules on liability exemptions and content moderation at EU level to guarantee the respect of fundamental rights and the freedoms of users across the EU; believes that such rules should maintain liability exemptions for intermediaries not having knowledge of the illegal activity or information on their platforms; expresses its concern that recent national laws to tackle hate speech and disinformation lead to a fragmentation of rules and to a lower level of fundamental rights protection in the EU;
Amendment 144 #
Motion for a resolution
Paragraph 9
9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by strengthening the rules on competition with regard to digital service providers in order to prevent harm to competition and consumers; requests that the Digital Services Act require digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to tackle illegal activities in line with European law; firmly believes that this should be harmonised within the digital single market;
Amendment 153 #
Motion for a resolution
Paragraph 10
10. Believes, in this regard, that large online platforms that are actively hosting, moderating or recommending content, services or products should bear more, yet proportionate, liability for the services they offer to users; considers, in this sense, that online marketplaces must be liable upon obtaining credible evidence of illegal activities; emphasises that this should be achieved without resorting to general monitoring requirements, nor to a general and undefined duty of care; highlights that, in order to ensure legal certainty, the new legal framework shall exhaustively and explicitly spell out the obligations of digital service providers;
Amendment 159 #
Motion for a resolution
Paragraph 11
11. Urges the adoption of transparent notice-and-action mechanisms and requirements for platforms to take measures in order to address the appearance of illegal activities on their services; these measures should include a robust business user authentication and verification process for services and products offered or facilitated on their platforms, while preserving consumer anonymity; stresses that independent public authorities should be ultimately responsible for determining whether an activity is legal or not; supports a clear chain of responsibility to avoid unnecessary regulatory burdens for platforms and unnecessary and disproportionate restrictions on fundamental rights, including the freedom of expression;
Amendment 171 #
Motion for a resolution
Paragraph 12
12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter-notice procedures, to ensure that removal or blocking decisions are accurate, well-founded, protect consumers and respect fundamental rights; recalls that the possibility of effective judicial redress should be made available to satisfy the right to an effective remedy;
Amendment 172 #
Motion for a resolution
Paragraph 12 a (new)
12a. Stresses that, in order to protect the freedom of expression and information, it is crucial to maintain the limited liability regime for intermediaries not having knowledge of the illegal activity or information; highlights that the legal regime for digital providers’ liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers;
Amendment 174 #
Motion for a resolution
Paragraph 13
13. Supports the country of origin principle, including its consumer contracts derogation, but considers that clarifications to the liability regime, particularly for online marketplaces, are needed; considers improved coordination for removal requests between national competent authorities to be important; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights; stresses that an effective oversight and enforcement mechanism, including sanctions, should apply to those service providers that fail to comply with transparency obligations, judicial orders and other provisions of the Digital Services Act;
Amendment 180 #
Motion for a resolution
Paragraph 13 a (new)
13a. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent judicial authorities; only a hosting service provider that has actual knowledge of illegal content and its illegal nature shall be subject to content removal obligations;
Amendment 191 #
Motion for a resolution
Paragraph 14
14. Believes that, in order to protect consumers' fundamental rights and interests, the Digital Services Act should introduce rules aiming to ensure that the terms of service of digital service providers are clear, transparent and fair;
Amendment 193 #
Motion for a resolution
Paragraph 14 a (new)
14a. Stresses that, in line with the principle of data minimisation established by the General Data Protection Regulation, the Digital Services Act shall require intermediaries to enable the anonymous use of their services, and payment for them, wherever technically possible, as anonymity effectively prevents unauthorised disclosure, identity theft and other forms of abuse of personal data collected online; notes that only where existing legislation requires businesses to communicate their identity could providers of major marketplaces be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld;
Amendment 196 #
Motion for a resolution
Paragraph 15
15. Expects guidelines to include increased transparency rules on content moderation or political advertising policy to ensure that legal content is not removed;
Amendment 216 #
Motion for a resolution
Paragraph 17
17. Calls, in this regard, for a regular, comprehensive and consistent public reporting obligation for platforms, proportionate to their scale of reach and operational capacities, including, inter alia, information on the measures adopted against illegal activities online, the amount of illegal material removed, and the number and outcome of internal complaints and judicial remedies;
Amendment 222 #
Motion for a resolution
Paragraph 18
18. Calls, moreover, for a regular public reporting obligation for national authorities, including, inter alia, information on the number of removal orders, the number of instances of identified illegal content or activities which led to investigation and prosecution, and the number of cases of content or activities wrongly identified as illegal;
Amendment 235 #
Motion for a resolution
Paragraph 20
20. Supports the creation of an enforcement mechanism coordinated at EU level, with a clear allocation of responsibilities and the necessary enforcement tools, to exercise effective oversight of compliance with the applicable rules; believes that it should enforce procedural safeguards and transparency and provide quick and reliable guidance on contexts in which legal content is to be considered harmful;
Amendment 239 #
Motion for a resolution
Paragraph 21
21. Considers that the transparency reports drawn up by platforms and national competent authorities should be made available to this enforcement mechanism, which should be tasked with drawing up yearly reports that provide a structured analysis of illegal content removal and blocking at EU level;
Amendment 247 #
Motion for a resolution
Paragraph 22
22. Stresses that this enforcement mechanism should not take on the role of content moderator, but that it should analyse, upon complaint or on its own initiative, whether and how digital service providers amplify illegal content; calls for this regulator to have the power to impose proportionate fines or other corrective actions when platforms do not provide sufficient information on their procedures or algorithms in a timely manner;
Amendment 248 #
Motion for a resolution
Paragraph 22 a (new)
22a. Is concerned that the increased use of automated decision making and machine learning for purposes such as identification, prediction of behaviour or targeted advertising leads to exacerbated direct and indirect discrimination based on grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation when using digital services; insists that the Digital Services Act must aim to ensure a high level of transparency as regards the functioning of online services and a digital environment free of discrimination;
Amendment 254 #
Motion for a resolution
Paragraph 23
23. Underlines the importance of empowering users to enforce their own fundamental rights online, including by means of transparency obligations for online services and easily accessible, impartial, efficient and free complaint procedures, legal remedies, educational measures and awareness-raising on data protection issues;