21 Amendments of Anne-Sophie PELLETIER related to 2020/2022(INI)
Amendment 19 #
Motion for a resolution
Recital B
B. whereas the data protection rules applicable to all providers offering digital services in the EU’s territory were recently updated and harmonised across the EU with the General Data Protection Regulation, whose enforcement needs to be strengthened;
Amendment 35 #
Motion for a resolution
Recital D
D. whereas a small number of mostly non-European service providers have significant market power and exert influence over suppliers and control how information, services and products are presented, thereby having an impact on the rights and freedoms of individuals and on our societies and democracies;
Amendment 38 #
Motion for a resolution
Recital E
E. whereas the policy approach to tackling harmful and illegal activities online in the EU has mainly focused on voluntary cooperation thus far, but a growing number of Member States are adopting national legislation to address illegal content;
Amendment 61 #
Motion for a resolution
Recital J b (new)
Jb. whereas algorithms used for automated decision-making or profiling often reproduce existing discriminatory patterns in society, thereby leading to a high risk of exacerbated discrimination for persons already affected.
Amendment 70 #
Motion for a resolution
Recital L
L. whereas, according to the jurisprudence of the Court of Justice of the European Union (CJEU), host providers may have recourse to automated search tools and technologies to assess whether content is equivalent to content previously declared unlawful and should thus be removed following an order from a Member State, but are not obliged to use such automated tools;
Amendment 80 #
Motion for a resolution
Paragraph 1
Amendment 96 #
Motion for a resolution
Paragraph 2
2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights, including data protection, are respected; calls for comprehensive and effective regulatory intervention based on the principles of necessity and proportionality;
Amendment 107 #
Motion for a resolution
Paragraph 3
3. Deems it necessary that illegal activities are removed swiftly and consistently in order to address law infringements and fundamental rights violations; considers that voluntary codes of conduct lack adequate enforcement and have proven to be inefficient in addressing the issue;
Amendment 111 #
Motion for a resolution
Paragraph 4
4. Recalls that illegal information, services and products online should not only be removed by online platforms, but should be followed up by law enforcement and the judiciary; calls on the Commission to consider obliging major hosting service providers to report serious crime to the competent law enforcement authority upon obtaining actual knowledge of such a crime; calls for barriers to filing complaints with competent authorities to be removed; is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent authorities, as well as cross-border cooperation between national competent authorities, should be improved; stresses in this regard the need to respect the legal order of the EU and the established principles of cross-border cooperation; stresses that competent authorities have to be provided with adequate resources in order to be effective;
Amendment 120 #
Motion for a resolution
Paragraph 5
5. Acknowledges the fact that the decision on the illegal nature of online information, products and services is difficult, as it requires contextualisation; warns that automated tools are unable to differentiate illegal content from content that is legal in a given context, which could lead to unnecessary restrictions being placed on the freedom of expression; highlights that a review of automated reports by service providers, their staff or their contractors does not solve this problem, as private staff lack the independence, qualification and accountability of public authorities; stresses, therefore, that the Digital Services Act shall explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation, and refrain from imposing notice-and-stay-down mechanisms; content moderation procedures used by providers shall not lead to any ex-ante control measures based on automated tools or upload-filtering of content;
Amendment 133 #
Motion for a resolution
Paragraph 7
7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by new technologies and ensuring legal clarity, respect for fundamental rights and enhanced consumer protection; considers that the reform should build on the solid foundation of, and full compliance with, existing EU law, especially the General Data Protection Regulation and the Directive on privacy and electronic communications; calls on the Council to swiftly reach a general approach which does not lower current levels of protection for consumers, in order to start trilogue negotiations with the European Parliament on the proposal for the ePrivacy Regulation as soon as possible;
Amendment 141 #
Motion for a resolution
Paragraph 8
8. Deems it indispensable to have the widest-possible harmonisation and clarification of rules on liability exemptions and content moderation at EU level to guarantee the respect of fundamental rights and the freedoms of users across the EU; believes that such rules should maintain liability exemptions for intermediaries not having knowledge of the illegal activity or information on their platforms; expresses its concern that recent national laws to tackle hate speech and disinformation lead to a fragmentation of rules and to a lower level of fundamental rights protection in the EU;
Amendment 144 #
Motion for a resolution
Paragraph 9
9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by strengthening the rules on competition with regard to digital service providers to prevent harm to competition and consumers; requests that the Digital Services Act require digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to tackle illegal activities in line with European law; firmly believes that this should be harmonised within the digital single market;
Amendment 159 #
Motion for a resolution
Paragraph 11
11. Urges the adoption of transparent notice-and-action mechanisms and requirements for platforms to take measures in order to address the appearance of illegal activities on their services; these measures should include a robust business user authentication and verification process for services and products offered or facilitated on their platforms, while preserving consumer anonymity; stresses that independent public authorities should be ultimately responsible for determining whether an activity is legal or not; supports a clear chain of responsibility to avoid unnecessary regulatory burdens for the platforms and unnecessary and disproportionate restrictions on fundamental rights, including the freedom of expression;
Amendment 171 #
Motion for a resolution
Paragraph 12
12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter-notice procedures, to ensure that removal or blocking decisions are accurate, well-founded, protect consumers and respect fundamental rights; recalls that the possibility of effective judicial redress should be made available to satisfy the right to an effective remedy;
Amendment 174 #
Motion for a resolution
Paragraph 13
13. Supports the country of origin principle, including its consumer contracts derogation, but considers that clarifications to the liability regime, particularly for online marketplaces, are needed; considers improved coordination of removal requests between national competent authorities to be important; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect for fundamental rights; stresses that an effective oversight and enforcement mechanism, including sanctions, should apply to those service providers that fail to comply with transparency obligations, judicial orders and other provisions of the Digital Services Act;
Amendment 180 #
Motion for a resolution
Paragraph 13 a (new)
13a. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent judicial authorities; only a hosting service provider that has actual knowledge of illegal content and of its illegal nature shall be subject to content removal obligations;
Amendment 216 #
Motion for a resolution
Paragraph 17
17. Calls, in this regard, for a regular, comprehensive and consistent public reporting obligation for platforms, proportionate to their scale of reach and operational capacities, including inter alia information on the measures adopted against illegal activities online, the amount of illegal material removed, and the number and outcome of internal complaints and judicial remedies;
Amendment 222 #
Motion for a resolution
Paragraph 18
18. Calls, moreover, for a regular public reporting obligation for national authorities, including inter alia information on the number of removal orders, the number of instances of identified illegal content or activities that led to investigation and prosecution, and the number of cases of content or activities wrongly identified as illegal;
Amendment 248 #
Motion for a resolution
Paragraph 22 a (new)
22a. Is concerned that the increased use of automated decision making and machine learning for purposes such as identification, prediction of behaviour or targeted advertising leads to exacerbated direct and indirect discrimination based on grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation when using digital services; insists that the Digital Services Act must aim to ensure a high level of transparency as regards the functioning of online services and a digital environment free of discrimination;
Amendment 254 #
Motion for a resolution
Paragraph 23
23. Underlines the importance of empowering users to enforce their own fundamental rights online, including by means of transparency obligations for online services and easily accessible, impartial, efficient and free complaint procedures, legal remedies, educational measures and awareness-raising on data protection issues;