20 amendments by Beata KEMPA related to 2020/2022(INI)
Amendment 33 #
Motion for a resolution
Recital D
D. whereas a small number of mostly non-European service providers have a monopoly over the market and exert influence on the rights and freedoms of individuals, our societies and democracies, thus also giving them enormous influence on the functioning of all Community countries and their citizens;
Amendment 43 #
Motion for a resolution
Recital F
F. whereas some forms of harmful content may be legal, yet detrimental to society or democracy, with examples such as opaque political advertising and disinformation on COVID-19 causes and remedies;
Amendment 49 #
Motion for a resolution
Recital G
Amendment 53 #
Motion for a resolution
Recital I
Amendment 57 #
Motion for a resolution
Recital J
J. whereas the lack of robust public data on the prevalence and removal of illegal and harmful content online, as well as the lack of proper transparency from internet platforms and services as to the algorithms they use, creates a deficit of accountability;
Amendment 69 #
Motion for a resolution
Recital L
L. whereas automated search tools and technologies used to assess whether content is equivalent to content previously declared unlawful are unreliable and do not provide adequate protection for freedom of expression and civil liberties online; whereas any attempt to proactively filter content should be limited, and any automatic deletion of content must always be carried out under human supervision and action;
Amendment 72 #
Motion for a resolution
Recital L a (new)
La. whereas the internet and internet platforms are still a key location for terrorist groups’ activities, being used as a tool for spreading propaganda, recruitment and the promotion of their activities;
Amendment 106 #
Motion for a resolution
Paragraph 3
3. Deems it necessary that flagrantly illegal content is removed swiftly and consistently in order to address terrorist propaganda; considers that voluntary codes of conduct have helped to reduce the appearance of illegal content on the internet and are a good mechanism that should be strengthened;
Amendment 115 #
Motion for a resolution
Paragraph 4
4. Recalls that illegal content online should not only be removed by online platforms, but should be followed up by law enforcement and the judiciary; finds, in this regard, that a key issue in some Member States is not that they have unresolved cases but rather unopened ones; calls for barriers to filing complaints with competent authorities to be removed; is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent authorities, as well as between national competent authorities themselves, should be improved, for instance by introducing tools based on cooperation and mutual trust between Member States, going beyond the cross-border order to remove online content which is clearly and unquestionably illegal;
Amendment 124 #
Motion for a resolution
Paragraph 5
5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as it requires contextualisation; warns that some automated tools are not sophisticated enough to take contextualisation into account, which could place unnecessary and harmful restrictions on the freedom of expression, political views and the right to receive a variety of often controversial information, and result in the filtering and censorship of the internet;
Amendment 146 #
Motion for a resolution
Paragraph 9
9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to remove illegal content in line with the values that derive from the Roman civilisation and the Christian ethics that underpin the existence of the European Community; firmly believes that this should be harmonised within the digital single market;
Amendment 161 #
Motion for a resolution
Paragraph 11
11. Highlights that this should include rules on the notice-and-action mechanisms and requirements for platforms to take proactive measures that are proportionate to their scale of reach and technical and operational capacities in order to address the appearance of illegal content on their services; supports a balanced duty-of-care approach and a clear chain of responsibility to avoid unnecessary regulatory burdens for the platforms and unnecessary and disproportionate restrictions on fundamental rights, including the freedom of controversial and polemical expression, as well as on the promotion of various philosophical, social and political ideas;
Amendment 169 #
Motion for a resolution
Paragraph 12
12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter notice procedures, to ensure that removal or blocking decisions are accurate, well- founded and respect fundamental rights; recalls that the possibility of judicial redress, following the final decision taken by the platforms in accordance with the internal complaints system, should be made available to satisfy the right to effective remedy;
Amendment 177 #
Motion for a resolution
Paragraph 13
13. Supports limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights and civil rights and freedoms; stresses that proportionate sanctions should apply to those service providers that fail to comply with legitimate orders even though they possess the technical and operational capacities;
Amendment 194 #
Motion for a resolution
Paragraph 15
15. Underlines that certain types of legal, yet harmful, content should also be addressed to ensure a fair digital ecosystem; any attempt to regulate or moderate political advertising should be prohibited;
Amendment 217 #
Motion for a resolution
Paragraph 17
17. Calls, in this regard, for a regular annual public reporting obligation for platforms, proportionate to their scale of reach and operational capacities; stresses that such reports, covering actions taken in the year preceding the year of submission, should be submitted by the end of the first quarter of that year;
Amendment 225 #
Motion for a resolution
Paragraph 18
18. Calls, moreover, for a regular annual public reporting obligation for national authorities;
Amendment 229 #
Motion for a resolution
Paragraph 20
Amendment 237 #
Motion for a resolution
Paragraph 21
21. Considers that the transparency reports drawn up by platforms and national competent authorities should be made available to the EU bodies, which should be tasked with drawing up yearly reports that provide a structured analysis of illegal content removal and blocking at EU level; stresses that these reports should be published annually through the public services of the EU institutions;
Amendment 246 #
Motion for a resolution
Paragraph 22
22. Stresses that the EU institutions should not take on the role of content moderator, but that they should analyse, upon complaint or on their own initiative, whether and how digital service providers amplify illegal content; calls for the EU institutions to have the power to impose proportionate fines or indicate corrective actions when platforms do not provide sufficient information on their procedures or algorithms in a timely manner;