Progress: Procedure completed
Role | Committee | Rapporteur | Shadows |
---|---|---|---|
Lead | LIBE | PEETERS Kris (EPP) | KALJURAND Marina (S&D), KÖRNER Moritz (Renew), BREYER Patrick (Verts/ALE), JAKI Patryk (ECR), ERNST Cornelia (GUE/NGL) |
Committee Opinion | IMCO | BIELAN Adam (ECR) | Dita CHARANZOVÁ (RE), Deirdre CLUNE (PPE), Adriana MALDONADO LÓPEZ (S&D) |
Committee Opinion | CULT | KAMMEREVERT Petra (S&D) | Niklas NIENASS (Verts/ALE), Alexis GEORGOULIS (GUE/NGL), Tomasz FRANKOWSKI (PPE), Dace MELBĀRDE (ECR) |
Lead committee dossier:
Legal Basis:
RoP 54
Subjects
Events
The European Parliament adopted by 566 votes to 45, with 80 abstentions, a resolution on the Digital Services Act and fundamental rights issues posed.
Members stressed that fundamental rights, such as the protection of privacy and personal data, the principle of non-discrimination, as well as freedom of expression and information, need to be ingrained at the core of a successful and durable EU policy on digital services.
Tailored approach
Parliament urged the Commission to adopt a tailored regulatory approach in addressing the differences that still persist between online and offline worlds and the challenges raised by the diversity of actors and services offered online. It considered it essential to apply different regulatory approaches to illegal and legal content. Illegal content online and cyber-enabled crimes should be tackled with the same rigour and on the basis of the same legal principles as illegal content and criminal behaviour offline, and with the same guarantees for citizens.
Illegal content
The resolution deemed it necessary that illegal content be removed swiftly and consistently to address crimes and fundamental rights violations. Content removal should be ‘diligent, proportionate and non-discriminatory’ to safeguard freedom of expression and information and privacy. Moreover, any content removal measures legally imposed by digital services legislation should only apply to illegal content as defined in European or national legislation.
Members called on the Commission to consider obliging online platforms to report serious crime to the competent authority when they have received knowledge of such a crime. They called for the systematic and immediate removal of illegal content in order to address infringements, notably those relating to children and terrorist content, and fundamental rights violations.
Illegal content online should not only be removed by online platforms, but should also be followed up by law enforcement and the judiciary where criminal acts are concerned. Special attention should be paid to harmful content in the context of minors using the internet, especially as regards their exposure to cyberbullying, sexual harassment, pornography, violence and self-harm.
Spreading harmful content
Parliament called for action to combat problematic behaviour such as micro-targeting based on citizens' vulnerabilities, misleading advertising, the spread of hate speech and disinformation, the presence of algorithms creating false profiles or manipulating online content, and political profiling to manipulate voting behaviour.
Members called for transparency in monetisation policies of online platforms and suggested that steps be taken to detect and report content posted by social bots on social networks.
The resolution welcomed the Commission's initiative to set up a European Digital Media Observatory to support independent fact-checking services, increase public knowledge about online disinformation and support public authorities responsible for monitoring digital media.
Improved cooperation
Given the borderless nature of the internet and the fast dissemination of illegal content online, Members considered that cooperation between service providers and national competent authorities, as well as cross-border cooperation between national competent authorities, should be improved and based on the principles of necessity and proportionality.
Harmonisation on liability
Members deemed it indispensable to have the full harmonisation and clarification of rules on liability at EU level to guarantee the respect of fundamental rights and the freedoms of users across the EU.
Legislative proposals should be put forward that keep the digital single market open and competitive by providing harmonised requirements for digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to address illegal content in line with national and European law, including via a harmonised notice-and-action procedure.
To guarantee proper enforcement of the Digital Services Act, the oversight of compliance with the procedures, procedural safeguards and transparency obligations laid down in the act should be harmonised within the digital single market. Strong and rigorous enforcement by an independent EU oversight structure is supported.
The Committee on Civil Liberties, Justice and Home Affairs adopted the own-initiative report by Kris PEETERS (EPP, BE) on the Digital Services Act and fundamental rights issues posed.
Members stressed that fundamental rights, such as the protection of privacy and personal data, the principle of non-discrimination, as well as freedom of expression and information, need
to be ingrained at the core of a successful and durable EU policy on digital services. The types of digital services and the roles of digital service providers have drastically changed since the adoption of the e-Commerce Directive 20 years ago.
Data protection rules applicable to all providers offering digital services in the EU’s territory, on the other hand, were recently updated and harmonised across the EU with the General Data Protection Regulation.
Tailored approach
In this regard, Members urged the Commission to adopt a tailored regulatory approach in order to address the differences that still persist between online and offline worlds and the challenges raised by the diversity of actors and services offered online. They considered it essential to apply different regulatory approaches to illegal and legal content. Illegal content online and cyber-enabled crimes should be tackled with the same rigour and on the basis of the same legal principles as illegal content and criminal behaviour offline, and with the same guarantees for citizens.
Illegal content
The report deemed it necessary that illegal content be removed swiftly and consistently in order to address crimes and fundamental rights violations. Illegal content online should not only be removed by online platforms, but should also be followed up by law enforcement and the judiciary where criminal acts are concerned.
Special attention should be paid to harmful content in the context of minors using the internet, especially as regards their exposure to cyberbullying, sexual harassment, pornography, violence and self-harm.
Members called on the Commission to consider obliging online platforms to report serious crime to the competent authority when they have received knowledge of such a crime.
Improved cooperation
Given the borderless nature of the internet and the fast dissemination of illegal content online, Members considered that cooperation between service providers and national competent authorities, as well as cross-border cooperation between national competent authorities, should be improved and based on the principles of necessity and proportionality.
Member States are called on to equip their law enforcement and judicial authorities with the necessary expertise, resources and tools to allow them to effectively and efficiently deal with the increasing number of cases involving illegal content online and with dispute resolution concerning the taking offline of content, and to improve access to justice in the area of digital services.
The report highlighted the fact that a specific piece of content may be deemed illegal in one Member State but is covered by the right to freedom of expression in another. Members suggested that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by the fragmentation between the Member States and new technologies, as well as ensuring legal clarity and respect for fundamental rights, in particular the freedom of expression.
Harmonisation on liability
Members deemed it indispensable to have the full harmonisation and clarification of rules on liability at EU level to guarantee respect for fundamental rights and the freedoms of users across the EU. Legislative proposals should be put forward that keep the digital single market open and competitive by providing harmonised requirements for digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to address illegal content in line with national and European law, including via a harmonised notice-and-action procedure.
In order to guarantee proper enforcement of the Digital Services Act, the oversight of compliance with the procedures, procedural safeguards and transparency obligations laid down in the act should be harmonised within the digital single market. Strong and rigorous enforcement by an independent EU oversight structure is supported.
Documents
- Results of vote in Parliament
- Decision by Parliament: T9-0274/2020
- Debate in Parliament
- Committee report tabled for plenary: A9-0172/2020
- Committee opinion: PE648.588
- Committee opinion: PE648.599
- Amendments tabled in committee: PE653.762
- Committee draft report: PE650.509
Activities
- Marcel KOLAJA
Plenary Speeches (2)
- 2020/10/19 Digital Services Act: Improving the functioning of the Single Market - Digital Services Act: adapting commercial and civil law rules for commercial entities operating online - Digital Services Act and fundamental rights issues posed - Framework of ethical aspects of artificial intelligence, robotics and related technologies - Civil liability regime for artificial intelligence - Intellectual property rights for the development of artificial intelligence technologies (debate)
- 2020/10/19 Digital Services Act: Improving the functioning of the Single Market - Digital Services Act: adapting commercial and civil law rules for commercial entities operating online - Digital Services Act and fundamental rights issues posed - Framework of ethical aspects of artificial intelligence, robotics and related technologies - Civil liability regime for artificial intelligence - Intellectual property rights for the development of artificial intelligence technologies (continuation of debate)
- Kris PEETERS
Plenary Speeches (2)
- 2020/10/19 Digital Services Act: Improving the functioning of the Single Market - Digital Services Act: adapting commercial and civil law rules for commercial entities operating online - Digital Services Act and fundamental rights issues posed - Framework of ethical aspects of artificial intelligence, robotics and related technologies - Civil liability regime for artificial intelligence - Intellectual property rights for the development of artificial intelligence technologies (debate)
- 2020/10/19 Digital Services Act: Improving the functioning of the Single Market - Digital Services Act: adapting commercial and civil law rules for commercial entities operating online - Digital Services Act and fundamental rights issues posed - Framework of ethical aspects of artificial intelligence, robotics and related technologies - Civil liability regime for artificial intelligence - Intellectual property rights for the development of artificial intelligence technologies (continuation of debate)
- Andrus ANSIP
- Izaskun BILBAO BARANDICA
- Dita CHARANZOVÁ
- Andor DELI
- Geoffroy DIDIER
- Evelyne GEBHARDT
- Sylvie GUILLAUME
- Eva KAILI
- Petra KAMMEREVERT
- Gilles LEBRETON
- Antonius MANDERS
- Emmanuel MAUREL
- Cláudia MONTEIRO DE AGUIAR
- Sirpa PIETIKÄINEN
- Ivan ŠTEFANEC
- József SZÁJER
- Carlos ZORRINHO
- Josianne CUTAJAR
- Sandra PEREIRA
- Gunnar BECK
- Gwendoline DELBOS-CORFIELD
- Maximilian KRAH
- Stéphane SÉJOURNÉ
- Karen MELCHIOR
- Robert ROOS
- Edina TÓTH
- Sabrina PIGNEDOLI
- Eugen JURZYCA
- Adriana MALDONADO LÓPEZ
- Tomasz FRANKOWSKI
- Beata MAZUREK
- Dace MELBĀRDE
- Liesje SCHREINEMACHER
- Alessandra BASSO
- Geert BOURGEOIS
- Patrick BREYER
- Maria-Manuel LEITÃO-MARQUES
- Clara PONSATÍ OBIOLS
- Jean-Lin LACAPELLE
Votes
A9-0172/2020 - Kris Peeters - Resolution
Amendments | Dossier |
---|---|
397 | 2020/2022(INI) |
2020/04/28
CULT
51 amendments...
Amendment 1 #
Draft opinion Paragraph 1 1. Points out that fundamental rights constitute an objective system of values which ensures that fundamental communication freedoms are not alterable, including by private-law agreements or business terms and conditions; points out that consumer protection, user safety, the option of online anonymity and freedom of speech must be at the core of protecting the fundamental rights; stresses the importance of helping consumers and users take greater control of and responsibility for their own data and identity;
Amendment 10 #
Draft opinion Paragraph 1 a (new) 1a. Points out that freedom of the arts and sciences are not alterable and therefore must not be compromised by any new legislation;
Amendment 11 #
Draft opinion Paragraph 1 b (new) 1b. Emphasises that content, which is legal and legally shared under Union or national law, has to stay online and that any removal of such content must not lead to the identification of individual users nor to the processing of personal data;
Amendment 12 #
1b. Points out that media ecosystem suffers from disruptive effects of online platforms; emphasises that public authorities have a positive obligation to adopt a legal framework, which fosters the development of independent and pluralistic media;
Amendment 13 #
Draft opinion Paragraph 2 2. Calls for all proactive protective measures which might at the same time be detrimental to fundamental rights to remain tasks for the State that are subject to thorough judicial review and for no public-authority tasks to be transferred to private-sector firms; stresses the need to comply with the overarching legislative framework, to be subjected to judicial oversight, and not merely left to the discretion of private sector firms; calls for establishing a clear EU-wide framework for content blocking with clear constraints, while promoting transparency on what content is blocked and why; calls for balanced solutions regarding content removals with cooperation between platforms, regulative authorities, fact-checkers and users; stresses that sharing GDPR-compliant data on illegal activity with the law enforcement and other authorities should be a priority for platforms in addition to their own effective and appropriate safeguards;
Amendment 14 #
Draft opinion Paragraph 2 2. Calls for all protective measures which might at the same time be detrimental to fundamental rights to remain tasks for the State that are subject to thorough judicial review and for no public-authority tasks to be transferred to private-sector firms; considers that these sector-specific rules may ensure unhindered access to media services and content, as well as advance freedom and pluralism of the media;
Amendment 15 #
Draft opinion Paragraph 2 2. Calls for all protective measures
Amendment 16 #
Draft opinion Paragraph 2 2. Calls for all protective measures
Amendment 17 #
Draft opinion Paragraph 2 2. Calls for all protective measures which might at the same time be detrimental to fundamental rights to remain tasks for the State that are subject to thorough judicial review and for no public-authority tasks to be transferred to private-sector firms or individuals;
Amendment 18 #
Draft opinion Paragraph 2 2. Calls for all
Amendment 19 #
Draft opinion Paragraph 2 a (new) 2a. Calls on the Commission to ensure that transparency reports are made available by platform operators, which contain information about the number of cases where content was misidentified as illegal or as illegally shared and that competent authorities should make available information about the number of cases where removals lead to the investigation and the prosecution of crimes;
Amendment 2 #
Draft opinion Paragraph 1 1. Points out that fundamental rights constitute an objective system of values which ensures that fundamental communication freedoms
Amendment 20 #
Draft opinion Paragraph 2 a (new) 2a. Points out that some harmful content or partially accurate information may not necessarily be illegal; notes that automatic filtering tools may lead to filtering out of legal content; considers it necessary to ensure that content owners can defend their rights to a sufficient extent, when their content has been removed;
Amendment 21 #
Draft opinion Paragraph 2 a (new) 2a. Stresses that any monitoring of their content applied by online platform and other services should be submitted to rigorous and transparent standards, known by their users, and enable an effective right to appeal decisions, first to the online platform or service, but also to a public authority;
Amendment 22 #
Draft opinion Paragraph 2 a (new) 2a. Suggests that special attention should be paid to the protection of children and young people and that this protection should also be safeguarded under data protection law and calls for online services for the protection of children and young people to be subject to the highest data protection restrictions;
Amendment 23 #
Draft opinion Paragraph 2 a (new) 2a. Reiterates the work of the Human Rights Committees of the United Nations in further elaborating interpretatively the texts of the UN Human Rights Conventions, in order to be fit for the digital era, along with the work of the Special Procedures of the UN Human Rights Council;
Amendment 24 #
Draft opinion Paragraph 2 b (new) 2b. Points out that soft coordination, support or supplementary measures, such as codes of conduct or self-regulation and co-regulation, may be efficient regulatory means, provided that government agencies monitor their impact and legal provision is made for State regulation where they are proved to be ineffective, since they often allow a swift response to changing circumstances also involving non-EU participants;
Amendment 25 #
Draft opinion Paragraph 2 c (new) 2c. Points out that enforcement is in principle the responsibility of the national regulatory authorities also in cross-border cases and should not be relocated to the European level without good reason; further believes that the idea of the country of origin principle will be strengthened if the national regulatory authorities have effective enforcement tools and efficient cross-border cooperation procedures in place; at European level, this should be flanked by swift and efficient dispute settlement procedures that ensure lasting legal peace;
Amendment 26 #
Draft opinion Paragraph 3 3. Calls for recognition of the fact that services developed in the EU which guarantee effective and comprehensive privacy protection and maximum digital freedom represent an advantage in global competition that should not be underestimated, and calls on the Commission to promote their development in a more targeted manner; calls for European values to be upheld in a safe digital environment, promoting diversity of opinion, net neutrality, freedom of speech and access to information; calls for clear, uniform rules, for more platform and advertising industries to apply their principles on platform liability, illegal or harmful content, algorithmic accountability, transparent advertising and fight against dissemination of disinformation, hate speech and fake or bot accounts, to preserve fundamental people rights and freedom;
Amendment 27 #
Draft opinion Paragraph 3 3.
Amendment 28 #
Draft opinion Paragraph 3 3. Calls for recognition of the fact that those services developed in the EU which guarantee effective and comprehensive privacy protection and maximum digital freedom represent an advantage in global competition that should not be underestimated, and calls on the Commission to promote their development in a more targeted manner via clear and efficient solutions fit for digital age;
Amendment 29 #
Draft opinion Paragraph 3 3. Calls for recognition of the fact that services developed in the EU which guarantee effective and comprehensive privacy and data protection and promotes maximum digital
Amendment 3 #
Draft opinion Paragraph 1 1. Points out that fundamental rights constitute an objective
Amendment 30 #
Draft opinion Paragraph 3 3. Calls for recognition of the fact that services developed in the EU which guarantee effective and comprehensive privacy protection and m
Amendment 31 #
Draft opinion Paragraph 3 a (new) 3a. Ensures that private agreements between artists and companies do not contravene the fundamental rights of the artists, creators and all personnel employed in the creative and cultural sectors, respecting their work and their intellectual property, both financially and ethically; highlights the need to ensure a fair remuneration from all activities entailing creativity for all stakeholders involved in the process, and with due regard to their right to collective bargaining;
Amendment 32 #
Draft opinion Paragraph 3 a (new) 3a. Stresses that any new obligations on platforms should be proportional to their market share and financial capacity, in order to encourage fair competition and promote innovation; believes that such approach would help strengthen information and media plurality and cultural and linguistic diversity;
Amendment 33 #
Draft opinion Paragraph 4 4. Calls for sector-specific rules that serve to realise society-wide objectives and give tangible expression to them for certain sectors, such as the Audiovisual Media Services Directive (AVMSD) and the Copyright Directive, to take precedence over general rules
Amendment 34 #
Draft opinion Paragraph 4 4. Calls for sector-specific rules that serve to realise society-wide objectives and give tangible expression to them for certain sectors, such as the Audiovisual Media Services Directive (AVMSD), to take precedence over general rules
Amendment 35 #
Draft opinion Paragraph 4 4. Calls for sector-specific rules that serve to realise society-wide objectives and give tangible expression to them for certain sectors, such as the Audiovisual Media Services Directive (AVMSD) or the Copyright Directive, to take precedence over general rules.
Amendment 36 #
Draft opinion Paragraph 4 a (new) Amendment 37 #
Draft opinion Paragraph 4 a (new) 4a. Stresses out that any rule on content moderation for service providers must ensure full respect for freedom of expression, which according to Art. 11 CFREU, includes "freedom to hold opinions and to receive and impart Information and ideas without interference by public authorities and regardless of frontiers'', and that access to a wide variety of opinions contributes to the development of open and democratic societies even when such views are controversial or unpalatable;
Amendment 38 #
Draft opinion Paragraph 4 a (new) 4a. Stresses the importance of preventing the deployment of mass surveillance and identification technologies, without fully understanding their impacts on people, freedoms and fundamental rights, and without ensuring that these systems are fully compliant with data protection and privacy law as well as human rights;
Amendment 39 #
Draft opinion Paragraph 4 b (new) 4b. Emphasises that the sharing of personal data and the processing of data for the purposes of the new Digital Single Act must respect the safeguards set out by the GDPR and the rules of the protection of data put in place in the Union; highlights that there is no need for any 'lex specialis' derogation from the General Rules of Data Protection;
Amendment 4 #
Draft opinion Paragraph 1 1. Points out that fundamental rights constitute an objective system of values which ensures that fundamental communication freedoms and property, as well as its protection, are not alterable, including by private-law agreements or business terms and conditions;
Amendment 40 #
Draft opinion Paragraph 4 b (new) 4b. Stresses the need to give citizens more control over how their personal data is managed and protected online, while also placing more responsibility on businesses in their data protection practices;
Amendment 41 #
Draft opinion Paragraph 4 c (new) 4c. Highlights the need to ensure that the collection and processing of all personal data, which does not fall under the scope of Directive (EU) 2016/680 on the Protection of natural persons with regard to processing of personal data by competent authorities for the purpose of law enforcement, or under the scope of the EU General Data Protection Regulation (GDPR), is done in accordance with the principles of legality, necessity and proportionality, as established by Article 9 of the Council of Europe's Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention No. 108);
Amendment 42 #
Draft opinion Paragraph 4 c (new) 4c. Calls on the Commission and the Member States to promote cooperation between the public and private sectors as well as academia, in order to reinforce knowledge sharing, the promotion of safety education and training, data privacy, ethical implications, and respect for human rights, relating to the use of digital technology, robotics and Artificial Intelligence (AI);
Amendment 43 #
Draft opinion Paragraph 4 d (new) 4d. Believes that platform liability should be tailored, in order to respect the size of the operator and a clear distinction on the platforms' engagement with the content based on clear and verifiable criteria and aspects, such as editorial functions, actual knowledge and a certain degree of control; furthermore believes that any system proposed, should be accompanied by robust set of fundamental rights safeguards and adequate independent and impartial public oversight;
Amendment 44 #
Draft opinion Paragraph 4 d (new) 4d. Stresses that, regardless of social benefits provided by new technologies, digital services and data-driven technologies, including Artificial Intelligence (AI), addressing and analysing potential risks to democratic values, the rule of law and fundamental rights must be a top priority;
Amendment 45 #
Draft opinion Paragraph 4 e (new) 4e. Strongly believes that the issue of platform workers must be addressed by a specific, dedicated, labour-oriented piece of legislation and not in an act dealing with digital services; suggests that platform operators should be responsible as employers for platform workers, and thus they should be liable to provide quality work, individual labour rights, social protection training and fulfilling health and safety at work requirements; reiterates that, in case of a provision of a legal presumption that platform workers are workers, it would ensure that requisite responsibility of platform operators as employers, to guarantee labour rights and to contribute to social security for platform workers;
Amendment 46 #
Draft opinion Paragraph 4 e (new) 4e. Stresses that in many cases, fundamental rights in the Union are often under threat, and that they are already being unjustifiably, disproportionately and unlawfully violated in the name of security, public health and public interest; stresses that the principles of necessity and proportionality should always be at the forefront, whenever there is an interference with fundamental rights;
Amendment 47 #
Draft opinion Paragraph 4 f (new) 4f. Highlights the need to find an effective way to enforce properly and appropriately intellectual property rights thus fostering cultural and creative sectors without interfering with freedom of expression; believes that this balance should include proactive measures to be employed when necessary, in order to ensure that illegal and harmful content is not only taken down from online platforms but also that it stays down;
Amendment 48 #
Draft opinion Paragraph 4 f (new) 4f. Stresses the need to update, modify, increase the comprehensiveness, clarity, and transparency of EU and national rules, while at the same time, cutting unnecessary and outdated regulations rather than adding more regulation;
Amendment 49 #
Draft opinion Paragraph 4 g (new) 4g. Recognises the need to define companies' human rights responsibilities in line with the ''respect, protect, remedy'' framework set out under the UN Guiding Principles and Business and Human Rights, which should be promptly turned into legally binding standards of International Law, by requiring in particular a human rights approach to the development of terms of service and Community standards as well as policies governing access to and use of their platform;
Amendment 5 #
Draft opinion Paragraph 1 1. Points out that fundamental rights
Amendment 50 #
Draft opinion Paragraph 4 h (new) 4h. Highlights the need to ensure transparency to the maximum extent feasible to all measures taken under the new Act, including those taken by Member States' and Union authorities;
Amendment 51 #
Draft opinion Paragraph 4 i (new) 4i. Emphasises the need to establish effective and timely remedies accessible to all, without discrimination, that are independent and impartial;
Amendment 6 #
Draft opinion Paragraph 1 a (new) 1a. Calls for the establishment of a modern understanding of fundamental rights, according to which fundamental rights are not only defensive rights against the State, but also protect freedom by limiting power; fundamental rights must therefore also impose obligations on those who exercise power through their technical infrastructures; in the case of such situation-based binding of private players by fundamental rights on a par with the State, account shall be taken of the degree of market domination, a market-dominating or quasi-monopolistic position, the degree of user reliance on the offer and the affected interests of users, of the powerful players themselves and of other third parties;
Amendment 7 #
Draft opinion Paragraph 1 a (new) 1a. Underlines that the Digital Services Act should be fully compliant with the objective of ensuring the protection of fundamental rights, including the right to freedom of expression and information, privacy, as well as the right to property, including intellectual property;
Amendment 8 #
Draft opinion Paragraph 1 a (new) 1a. Considers that any forthcoming legislative proposals on digital services should be compliant with the protection of fundamental rights, including respect for freedom of expression, private life, cultural and linguistic diversity, freedom of arts and personal data protection;
Amendment 9 #
Draft opinion Paragraph 1 a (new) source: 650.541
2020/05/07
IMCO
79 amendments...
Amendment 1 #
Draft opinion Recital 1 a (new) 1a. Whereas respect and protection of fundamental rights and civil liberties must be a cornerstone of the revised and new rules for the digital single market that will be adopted through the Digital Services Act package;
Amendment 10 #
Draft opinion Paragraph 1 a (new) 1a. Stresses that specific attention be paid to ensure that no measures adopted affect or weaken the protection of freedom of expression;
Amendment 11 #
Draft opinion Paragraph 1 a (new) 1a. Proposes that the Digital Services Act package recognises the protection of fundamental rights on the digital single market as a key goal of the new and revised rules and considers that any legislative proposal put forward as part of the package should expressly address and make reference to any fundamental rights issue it might pose, in particular when such proposal sets out any restriction to fundamental rights or civil liberties.
Amendment 12 #
Draft opinion Paragraph 1 b (new) 1b. Believes that the Digital Services Act should respect the broad framework of fundamental European rights of users and consumers, such as the protection of privacy, non-discrimination, dignity, and free speech;
Amendment 13 #
Draft opinion Paragraph 2 2. States that limited liability provisions as set out in the e-Commerce Directive1 must be
Amendment 14 #
Draft opinion Paragraph 2 2. States that limited liability provisions as set out in the e-Commerce Directive1 must be maintained and strengthened in the Digital Services Act, including the long-established principle prohibiting general monitoring obligations, particularly in order to protect freedom of
Amendment 15 #
Draft opinion Paragraph 2 2. States that in general the limited liability provisions as set out in the e-Commerce Directive1
Amendment 16 #
Draft opinion Paragraph 2 2. States that limited liability provisions as set out in the e-Commerce Directive1 must be maintained
Amendment 17 #
Draft opinion Paragraph 2 2. States that limited liability provisions as set out in the e-Commerce Directive1 must be maintained and
Amendment 18 #
Draft opinion Paragraph 2 2. States that limited liability provisions as set out in the e-Commerce Directive1 must be
Amendment 19 #
Draft opinion Paragraph 2 2. States that limited liability provisions for passive hosting service providers as set out in the e-Commerce Directive1 must be maintained
Amendment 2 #
Draft opinion Recital 1 b (new) 1b. Whereas the fight against illegal or harmful content online can be a pretext for governments to restrict freedom of expression and persecute legitimate dissent and opposition;
Amendment 20 #
Draft opinion Paragraph 2 a (new) 2a. Calls on the Commission to introduce provisions protecting consumers from harmful microtargeting; in this respect, believes that specific limitations, i.e. of microtargeting based on characteristics exposing physical or psychological vulnerabilities, transparency obligations in regard to algorithms used by platforms and adequate tools empowering users to enforce fundamental rights online, are necessary in order to protect consumer rights effectively;
Amendment 21 #
Draft opinion Paragraph 2 a (new) 2a. Recalls that content removal mechanisms that are used outside the guarantees of due process contravene Article 10 of the European Convention on Human Rights.
Amendment 22 #
Draft opinion Paragraph 2 a (new) 2a. States that the Digital Services Act should maintain the ban on general monitoring obligations under Article 15 of the current e-Commerce Directive;
Amendment 23 #
Draft opinion Paragraph 2 a (new) 2a. Stresses that the Digital Services Act must take into account the General Data Protection Regulation;
Amendment 24 #
Draft opinion Paragraph 3 3. Recognises that
Amendment 25 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content
Amendment 26 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content; warns th
Amendment 27 #
Draft opinion Paragraph 3 3. Recognises that SMEs
Amendment 28 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content; warns that overburdening businesses with disproportionate new obligations could further hinder the growth of SMEs and require recourse to automatic filtering tools, which may often lead to the removal of legal content and may prevent them from entering the market, undermining the freedom to provide services; underlines, in this regard, the importance of stimulating the emergence of SMEs by lowering and removing market access barriers;
Amendment 29 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content; warns that overburdening businesses with disproportionate new obligations could further hinder the growth of SMEs and require recourse to automatic filtering tools, which may often lead to the removal of legal content; therefore, demands that when automatic filtering tools are applied, robust safeguards for transparency and accountability should be introduced with highly skilled independent and impartial public oversight;
Amendment 3 #
Draft opinion Recital 1 c (new) 1c. Whereas content moderation by content hosting platforms can result in illegitimate restrictions to freedom of expression either deliberately or through the use of automated content moderation mechanisms;
Amendment 30 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content; warns th
Amendment 31 #
Draft opinion Paragraph 3 3. Recognises that online intermediaries, including microcompanies, SMEs and large players have differing capabilities with regard to the moderation of content; warns that overburdening businesses with disproportionate new obligations could further hinder the growth of SMEs and require recourse to automatic filtering tools, which may often lead to the removal of legal content;
Amendment 32 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content;
Amendment 33 #
Draft opinion Paragraph 3 3. Recognises that SMEs and large players have differing capabilities with regard to the moderation of content; warns that overburdening businesses with disproportionate new obligations could further hinder the growth of SMEs
Amendment 34 #
Draft opinion Paragraph 3 a (new) 3a. Considers that ex ante competition rules could complement the enforcement of ex post competition rules to address abusive behaviours and other market distortions, which hamper a fair competition and a level playing field in the digital single market, with the ultimate aim of protecting the consumers’ freedoms and fundamental rights;
Amendment 35 #
3a. Recalls that the e-commerce directive is the legal framework for online services in the Internal Market that regulates content management; stresses that any fragmentation of that framework resulting from the revision of the e-commerce directive should be avoided;
Amendment 36 #
Draft opinion Paragraph 4 4. Notes th
Amendment 37 #
Draft opinion Paragraph 4 4. Notes the significant differences between digital services and calls for the avoidance of a one-size-fits-all approach
Amendment 38 #
Draft opinion Paragraph 4 a (new) 4a. Considers that access to the internet is now a fundamental right; notes that European citizens now expect to be able to contact the emergency services wherever they are, even if they do not necessarily know what network coverage they will have as they travel around Europe; considers that access to the telephone and internet network is now essential for reasons of internal market access and security; points out that the technology now exists for 100% coverage of a national network and that operating licences should impose this service requirement on operators.
Amendment 39 #
Draft opinion Paragraph 5 5. Recalls the fact that
Amendment 4 #
Draft opinion Recital 1 d (new) 1d. Whereas content can be illegal for a number of reasons such as interference with an individual’s fundamental rights, hate speech considerations, connection to criminal activity, damage to a person’s legitimate interests, damage to general interests, etc. and thus not all allegedly illegal content is equally dangerous and potentially harmful;
Amendment 40 #
Draft opinion Paragraph 5 5. Recalls the fact that
Amendment 41 #
Draft opinion Paragraph 5 5. Recalls the fact that
Amendment 42 #
Draft opinion Paragraph 5 5. Recalls the fact that misinformative and harmful content is not always illegal and that not all types of illegal content are harmonised at EU level; calls, therefore, for the establishment of a well-defined notice-and-takedown process; supports an intensive dialogue between
Amendment 43 #
Draft opinion Paragraph 5 5. Recalls the fact that misinformative and harmful content is not always illegal; calls, therefore, for the establishment of a
Amendment 44 #
Draft opinion Paragraph 5 5. Recalls the fact that mis
Amendment 45 #
Draft opinion Paragraph 5 a (new) 5a. Notes that information is sometimes manipulated by dominant media outlets whose loss-making economic models make them dependent on State aid; considers that the Commission authorises this type of public financing too easily without checking whether these subsidised outlets have control over all the information they produce; calls for the scope of the arrangements for monitoring the granting of State aid to be extended to cover such media outlets and public service broadcasters.
Amendment 46 #
Draft opinion Paragraph 5 a (new) 5a. Stresses the need to distinguish between ‘illegal’, ‘harmful’ and other content; notes that some content linked, for instance, to religious belief or political positions might be considered harmful without being illegal; considers that harmful legal content should not be regulated or defined in the Digital Services Act;
Amendment 47 #
Draft opinion Paragraph 5 a (new) 5a. Notes the importance of adopting further rules on digital advertising and ensuring full compliance with GDPR provisions, in order to avoid intrusive business models, behavioural manipulation and discriminatory practices, which have major effects on the Single Market and users’ fundamental rights, privacy and data security;
Amendment 48 #
Draft opinion Paragraph 5 a (new) 5a. Notes the proliferation of fake news and disinformation with false or misleading content, and of consumer scams by means of unsafe or counterfeit products; calls on the Commission to keep working and exploring new ways to combat fake news while preserving fundamental rights;
Amendment 49 #
Draft opinion Paragraph 5 b (new) 5b. Recalls its opposition to the granting of State aid to certain sectors of the press which then gain an advantage over others not so favoured; stresses that the resulting distortion of competition in the internal market is particularly unfair given the many digital news service providers which are excluded from such aid despite performing a similar activity and having a comparable, if not larger, target audience.
Amendment 5 #
Draft opinion Paragraph 1 1. Welcomes the Commission’s
Amendment 50 #
Draft opinion Paragraph 5 b (new) 5b. Highlights that tracking applications must respect privacy rights in addition to the provisions of the GDPR;
Amendment 51 #
Draft opinion Paragraph 6 6. Calls for the introduction of counter-notice tools to allow content owners to defend their rights adequately and in a timely manner when notified of any takedown; underlines
Amendment 52 #
Draft opinion Paragraph 6 6. Calls for the introduction of
Amendment 53 #
Draft opinion Paragraph 6 6. Calls for the introduction of counter-notice tools to allow
Amendment 54 #
Draft opinion Paragraph 6 6. Calls for the introduction of appropriate safeguards, due process obligations and counter
Amendment 55 #
Draft opinion Paragraph 6 6. Calls for the introduction of appropriate safeguards, due process obligations and counter
Amendment 56 #
Draft opinion Paragraph 6 6. Calls for the introduction of counter-notice tools to allow content
Amendment 57 #
Draft opinion Paragraph 6 6. Calls for the introduction of counter-notice tools to allow content owners to defend their rights adequately and in a timely manner when notified of any takedown; underlines its view that delegating the responsibility to set boundaries on freedom of speech only to private companies is unacceptable and creates risks for both citizens and businesses
Amendment 58 #
6a. Believes that allowing new innovative business models to flourish and strengthening the Digital Single Market by removing barriers to the free movement of digital content, barriers which create fragmented national markets and a demand for illegal content, have been proven to work in the past, especially in relation to infringements of intellectual property rights;
Amendment 59 #
Draft opinion Paragraph 6 a (new) 6a. Considers that the introduction of digital services has to go hand in hand with measures to develop the knowledge and skills of citizens and small businesses with regard to digitisation, in order to ensure that all citizens have access to these services;
Amendment 6 #
Draft opinion Paragraph 1 1. Welcomes the Commission’s intention to introduce a harmonised approach addressing obligations
Amendment 60 #
Draft opinion Paragraph 6 a (new) 6a. Notes that the new Digital Services Act should also address the challenges algorithms present in terms of ensuring non-discrimination, transparency and explainability, as well as liability; points out the need to monitor algorithms and to assess associated risks, to use high quality and unbiased datasets, as well as to help individuals acquire access to diverse content, opinions, high quality products and services;
Amendment 61 #
Draft opinion Paragraph 6 a (new) 6a. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is less clear-cut for other types of content as it requires contextualisation; warns that some automated tools are not sufficiently sophisticated to take context into account, which could lead to unnecessary restrictions being placed on freedom of expression.
Amendment 62 #
Draft opinion Paragraph 6 a (new) 6a. Calls for the introduction of transparency and accountability requirements regarding the decision-making processes for content flagging for content hosting providers and providers of automated content recognition tools, including the public documentation of, at a minimum, the existence and the functioning of content recognition technologies; the transparency requirements for content hosting providers should include the number of all received notices, the reasons for determining the legality of content or how it infringes terms of service, the concrete time frames for notifying the uploader, the number of appeals received and how they were resolved, and the number of erroneous takedowns;
Amendment 63 #
Draft opinion Paragraph 6 a (new) 6a. Considers that practices like profiling deeply interfere with people's rights and freedoms; recognises that the General Data Protection Regulation framework does not adequately protect consumers against profile building and unjustified automated decisions; is therefore of the opinion that, in order to ensure adequate protection of consumers, personal data should only be used where it is necessary to provide the service requested;
Amendment 64 #
Draft opinion Paragraph 6 a (new) 6a. Recalls that the current notice and take down mechanism does not prevent illegal content that was previously taken down from being re-uploaded; stresses the scale of services and content available online nowadays; calls therefore for a strengthened framework that foresees a notice and stay down mechanism in order to ensure that illegal content that was rightfully removed from their services following a notice stays down.
Amendment 65 #
Draft opinion Paragraph 6 a (new) 6a. Recommends that the Digital Services Act package introduces harmonised standards and procedures for tackling illegal content online, applicable to content moderation by content hosting platforms, on the basis of notice-and-action mechanisms, and to content supervision by any national authority, which ensure sufficient safeguards against abusive restrictive measures, including effective control by a court or other independent adjudicatory body; and considers that such standards should differentiate and set out different approaches depending on the protected interest the allegedly illegal content could potentially damage.
Amendment 66 #
Draft opinion Paragraph 6 b (new) 6b. Notes that the lack of access to digital services for certain categories of citizens would lead to the exclusion and violation of equal market and information access rights;
Amendment 67 #
Draft opinion Paragraph 6 b (new) 6b. Calls for transparency obligations for recommendation systems of content hosting providers which should take the form of real-time, high-level, anonymised data access through public APIs and include public documentation of recommendation outputs and their audiences, content-specific ranking decisions and other interventions by the platform as well as the organisational structures that control such systems;
Amendment 68 #
Draft opinion Paragraph 6 b (new) 6b. Stresses that access to the services of a digital network should allow, on equal subscription terms, identical and indiscriminate access to the whole network; calls, therefore, for a ban on the practices of setting content quotas or profiling on the basis of users’ characteristics, opinions or interests, and on the practices of blanking and shadow banning unless justified on the grounds of clearly identified misconduct on the part of the user.
Amendment 69 #
Draft opinion Paragraph 6 b (new) 6b. Remarks that interference with freedom of expression must be prescribed by law and thus insists that nothing in the Digital Services Act package should allow authorities to order the removal of legal content on the grounds that it is “harmful” or “fake”.
Amendment 7 #
Draft opinion Paragraph 1 1. Welcomes the Commission’s intention to introduce a harmonised approach addressing obligations
Amendment 70 #
Draft opinion Paragraph 6 c (new) 6c. Considers that the EU body tasked with the oversight of the application of the standards and procedure for tackling illegal content online should monitor and control both content moderation by content hosting platforms and content supervision by national authorities.
Amendment 71 #
Draft opinion Paragraph 6 c (new) 6c. Emphasises the right to education and cultural identity and the need for an honest presentation of online content, particularly as regards the classification of works of fiction as ‘recreational’ or ‘historical’ products.
Amendment 72 #
Draft opinion Paragraph 6 c (new) 6c. Calls for a risk assessment obligation for automated content flagging tools employed by content hosting providers, which would work in a co-regulatory fashion and would be conducted in regular time frames prescribed by law;
Amendment 73 #
Draft opinion Paragraph 6 d (new) 6d. Calls for the establishment of an EU body tasked with monitoring and ensuring compliance with the provisions of future regulation, including the screening of transparency reports and carrying out audits of algorithms provided to and employed by content hosting providers for content flagging and recommendation systems;
Amendment 74 #
Draft opinion Paragraph 6 e (new) 6e. Calls for the establishment of socially representative and diverse, in particular gender-balanced, co-regulatory social media councils as a multi-stakeholder mechanism, which would provide an open, transparent, accountable and participatory forum to address content moderation principles;
Amendment 75 #
Draft opinion Paragraph 6 f (new) 6f. Calls on Member States to establish independent dispute settlement bodies which would be tasked to settle disputes between uploaders, notifiers and content hosting providers;
Amendment 76 #
Draft opinion Paragraph 6 g (new) 6g. Calls on the Commission to include the principle of “freedom of expression by design” whereby the content hosting provider shall, both at the time of the design stage of the service and at the time of the provision of the service itself, implement appropriate technical and organisational measures in an effective manner and integrate the necessary safeguards into the provision of the service in order to meet the requirements of any future regulation and protect consumer rights;
Amendment 77 #
Draft opinion Paragraph 6 h (new) 6h. Calls on the Commission to introduce minimum standards for the internal rules, such as terms and conditions or community guidelines, of content hosting providers and providers of content moderation tools to provide for safeguards for fundamental rights, in particular with regard to transparency, accessibility, fairness, predictability and non-discriminatory enforcement;
Amendment 78 #
Draft opinion Paragraph 6 i (new) 6i. Recalls that profiling coupled with targeted advertisements not only undermines the democratic framework, but also leads to an unfair competitive advantage for dominant private actors collecting huge amounts of data and therefore calls on the Commission to limit targeted advertisement in order to create a level playing field for SMEs;
Amendment 79 #
Draft opinion Paragraph 6 j (new) 6j. Recalls that the choice of algorithmic tools for recommendation systems raises accountability and transparency concerns; therefore stresses the need to guarantee the right of users to opt out from recommended and personalised services;
Amendment 8 #
Draft opinion Paragraph 1 1. Welcomes the Commission’s intention to introduce a harmonised approach addressing obligations imposed on intermediaries, in order to avoid fragmentation of the internal market; stresses that any measure related to fundamental rights should
Amendment 9 #
Draft opinion Paragraph 1 1. Welcomes the Commission’s intention to introduce a harmonised approach addressing obligations imposed on intermediaries, in order to avoid fragmentation of the internal market; stresses that any measure related to fundamental rights should be carefully balanced and take into account the possible impact on the functioning of the internal market and on consumer protection, and calls on the Commission to avoid the ‘export’ of national regulations and instead to propose the most efficient and effective solutions for the internal market as a whole;
source: 650.628
2020/06/24
LIBE
267 amendments...
Amendment 1 #
Motion for a resolution Citation 3 — having regard to the Charter of Fundamental Rights of the European Union, in particular Article 6, Article 7, Article 8, Article 11, Article 13, Article 2
Amendment 10 #
Motion for a resolution Citation 7 d (new) — having regard to Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
Amendment 100 #
Motion for a resolution Paragraph 2 b (new) 2b. Requests that digital services should to the maximum extent possible be accessible without the need for users to reveal their identity.
Amendment 101 #
Motion for a resolution Paragraph 2 c (new) 2c. Emphasises that there are certain differences still between online and offline worlds, for instance, in terms of anonymity, the absence of a governing entity, between the balances of power and technical capabilities; calls, therefore, on the Commission to let the principles of human dignity and 'what is illegal offline is illegal online' prevail in its DSA proposal and to introduce in the DSA the concept of digital dignity, which builds upon these principles and embodies the fundamental rights of individuals;
Amendment 102 #
Motion for a resolution Paragraph 2 c (new) 2c. Reiterates that digital service providers must respect and enable their users’ right to data portability as laid down in Union law.
Amendment 103 #
Motion for a resolution Paragraph 3 3. Deems it necessary that illegal content is removed
Amendment 104 #
Motion for a resolution Paragraph 3 3. Deems it necessary that illegal content is removed swiftly and consistently
Amendment 105 #
Motion for a resolution Paragraph 3 3. Deems it necessary that illegal content is removed swiftly and consistently in order to address crimes and fundamental rights violations; considers that voluntary codes of conduct only partially address the issue and that a more effective liability regime for platforms should be introduced;
Amendment 106 #
Motion for a resolution Paragraph 3 3. Deems it necessary that flagrantly illegal content is removed swiftly and consistently in order to address crimes and
Amendment 107 #
Motion for a resolution Paragraph 3 3. Deems it necessary that illegal
Amendment 108 #
Motion for a resolution Paragraph 3 3. Deems it necessary that illegal content is removed swiftly and consistently in order to address crimes, especially those relating to children and fundamental rights violations; considers that voluntary codes of conduct only partially address the issue;
Amendment 109 #
Motion for a resolution Paragraph 3 3. Deems it necessary that illegal content is removed
Amendment 11 #
Motion for a resolution Citation 7 a (new) — having regard to the Commission recommendation of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177 final);
Amendment 110 #
Motion for a resolution Paragraph 3 a (new) 3a. Calls on digital service providers to take content offline in a diligent, proportionate and non-discriminatory manner, with due regard in all circumstances to the fundamental rights of users, and to take into account the fundamental importance of the freedom of expression and information in an open and democratic society, with a view to avoiding the removal of content which is not illegal; requests digital service providers that, on their own initiative, want to restrict certain legal content of their users to explore the possibility of labelling that content rather than taking it offline, giving users the chance to choose, on their own responsibility, whether to access that content.
Amendment 111 #
Motion for a resolution Paragraph 4 4. Recalls that illegal
Amendment 112 #
Motion for a resolution Paragraph 4 4. Recalls that illegal content online should not only be removed by online platforms, but should be followed up by law enforcement and the judiciary; finds, in this regard, that a key issue in some Member States is not that they have unresolved cases but rather unopened ones; considers that providing national judicial services with specialised staff and adequate financial resources is key to improving access to and the efficiency of the justice system in the area of digital services; calls for barriers to filing complaints with competent authorities to be removed; is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent
Amendment 113 #
Motion for a resolution Paragraph 4 4. Recalls that illegal content online should not only be removed by online platforms, but should be followed up by law enforcement and the judiciary; finds, in this regard, that a key issue in some Member States is not that they have unresolved cases but rather unopened ones; calls for barriers to filing complaints with competent authorities to be removed; is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent authorities should be improved; calls, to this end, on Member States to equip their law enforcement and judicial authorities with the necessary expertise, resources and tools to allow them to effectively deal with the increasing number of cases involving illegal content online;
Amendment 114 #
Motion for a resolution Paragraph 4 4. Recalls that illegal content online should not
Amendment 115 #
Motion for a resolution Paragraph 4 4.
Amendment 116 #
Motion for a resolution Paragraph 4 4. Recalls that illegal content online should not only be removed by online platforms, but should be followed up by law enforcement and the judiciary where criminal acts are concerned; finds, in this regard, that a key issue in some Member States is not that they have unresolved cases but rather unopened ones; calls for barriers to filing complaints with competent authorities to be removed; is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent authorities should be
Amendment 117 #
Motion for a resolution Paragraph 4 – subparagraph 1 (new) Is convinced that, given the borderless nature of the internet and the fast dissemination of illegal content online, cooperation between service providers and national competent authorities should be improved;
Amendment 118 #
Motion for a resolution Paragraph 4 a (new) 4a. Stresses that proportionate sanctions should be applied to violations of the law, which shall not encompass excluding individuals from digital services;
Amendment 119 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that, while the illegal nature of certain
Amendment 12 #
Motion for a resolution Citation 7 a (new) — having regard to the judgement of the Court of Justice of 24 November 2011 in case C-70/10, Scarlet Extended SA v Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM);
Amendment 120 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that
Amendment 121 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as it requires contextualisation; warns that some automated tools are not sophisticated enough to take contextualisation into account, which could lead to unnecessary
Amendment 122 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as
Amendment 123 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as it requires contextualisation; warns that some automated tools are not sophisticated enough to take contextualisation into account, which could lead to unnecessary restrictions being placed on the freedom of expression; considers that the use of artificial intelligence in this area must comply with EU data protection and transparency legislation and principles and must be subject to human supervision;
Amendment 124 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as it requires contextualisation; warns that some automated tools are not sophisticated enough to take contextualisation into account, which could lead to unnecessary and harmful restrictions being placed on the freedom of expression, political views and the right to receive a variety of often controversial information, leading to the filtering and censorship of the internet;
Amendment 125 #
Motion for a resolution Paragraph 5 5. Acknowledges the fact that, while the illegal nature of certain types of content can be easily established, the decision is more difficult for other types of content as it requires contextualisation; considers it necessary to provide a clear definition of ‘illegal’ content and ‘dangerous’ content; warns that some automated tools are not sophisticated enough to take contextualisation into account, which could lead to unnecessary restrictions being placed on the freedom of expression;
Amendment 126 #
Motion for a resolution Paragraph 5 – subparagraph 1 (new) considers, in this regard, that other stakeholders in the online ecosystem, such as users, right holders and media, can also play an important role in establishing whether content is illegal based on the specific context; invites these stakeholders to cooperate closely and exchange information with platforms to help them to effectively identify and address illegal content;
Amendment 127 #
Motion for a resolution Paragraph 5 a (new) 5a. Stresses that the removal of online content must be consistent with the freedoms of expression and communication;
Amendment 128 #
Motion for a resolution Paragraph 6 6. Underlines that a specific piece of information may be deemed illegal in one Member State but is covered by the right to freedom of expression in another; highlights that, in order to protect freedom of speech standards, to avoid conflicts of laws, to avert unjustified and ineffective geo-blocking and to aim for a harmonised digital single market, hosting service providers shall not be required to remove or disable access to information that is legal in their country of origin;
Amendment 129 #
Motion for a resolution Paragraph 6 6. Underlines that a specific piece of information may be deemed illegal in one Member State but is covered by the right to freedom of expression in another; stresses, therefore, that national authorities should only be allowed to address and enforce removal orders to service providers established in their territory;
Amendment 13 #
Motion for a resolution Citation 7 b (new) — having regard to the Europol Internet Organised Crime Threat Assessment (IOCTA) of 18 September 2018;
Amendment 130 #
Motion for a resolution Paragraph 6 6. Underlines that a specific piece of information may be deemed illegal in one Member State but is covered by the right to
Amendment 131 #
Motion for a resolution Paragraph 6 a (new) 6a. Is convinced that digital service providers must not be mandated to apply one Member State’s national restrictions on freedom of speech in another Member State where that restriction does not exist.
Amendment 132 #
Motion for a resolution Paragraph 6 a (new) 6a. Underlines that illegal content should be removed where it is hosted, and that mere conduit intermediaries shall not be required to block access to content;
Amendment 133 #
Motion for a resolution Paragraph 7 7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by new technologies and ensuring legal clarity
Amendment 134 #
Motion for a resolution Paragraph 7 7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by new technologies such as the prevalence of all-encompassing profiling and algorithmic decision-making that permeates all areas of life, and ensuring legal clarity and respect for fundamental rights; considers that the reform should build on the solid foundation of and full compliance with existing EU law, especially the General Data Protection Regulation and the Directive on privacy and electronic communications;
Amendment 135 #
Motion for a resolution Paragraph 7 7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by new technologies and ensuring legal clarity and respect for fundamental rights; considers that the reform should build on the solid foundation of and full compliance with existing EU law, especially the General Data Protection Regulation
Amendment 136 #
Motion for a resolution Paragraph 7 7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by the fragmentation between the Member States and new technologies, a
Amendment 137 #
Motion for a resolution Paragraph 7 7. Strongly believes that the current EU legal framework governing digital services should be updated with a view to addressing the challenges posed by new technologies and ensuring legal clarity and respect for fundamental rights; considers that the reform should build on the solid foundation of and full compliance with existing EU law, especially the General Data Protection Regulation
Amendment 138 #
Motion for a resolution Paragraph 7 7. Strongly believes that the current EU legal framework governing digital
Amendment 139 #
Motion for a resolution Paragraph 7 a (new) 7a. Highlights that the practical capacity of individuals to understand and navigate the complexity of the data ecosystems in which they are embedded is extremely limited, as is their ability to identify whether the information they receive and services they use are made available to them on the same terms as to other users; calls on the Commission, therefore, to place transparency and non-discrimination at the heart of the Digital Services Act;
Amendment 14 #
Motion for a resolution Citation 8
Amendment 140 #
Motion for a resolution Paragraph 8 8. Deems it indispensable to have the widest-possible harmonisation and clarification of rules on liability
Amendment 141 #
Motion for a resolution Paragraph 8 8. Deems it indispensable to have the widest-possible harmonisation and clarification of rules on liability
Amendment 142 #
Motion for a resolution Paragraph 8 8. Deems it indispensable to have the widest-possible harmonisation of rules on liability exemptions and content moderation at EU level to guarantee the respect of fundamental rights and the freedoms of users across the EU; expresses its concern that recent national laws to tackle hate speech and disinformation lead to an increasing fragmentation of rules;
Amendment 143 #
Motion for a resolution Paragraph 8 8. Deems it indispensable to have
Amendment 144 #
Motion for a resolution Paragraph 9 9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by
Amendment 145 #
Motion for a resolution Paragraph 9 9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to pre
Amendment 146 #
Motion for a resolution Paragraph 9 9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to remove illegal content in line with
Amendment 147 #
Motion for a resolution Paragraph 9 9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to remove illegal content in line with European values, while also establishing a procedure for collaboration with law enforcement and judicial authorities; firmly believes that this should be harmonised within the digital single market;
Amendment 148 #
Motion for a resolution Paragraph 9 9. Calls, to this end,
Amendment 149 #
Motion for a resolution Paragraph 9 9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to
Amendment 15 #
Motion for a resolution Citation 8 — having regard to the
Amendment 150 #
Motion for a resolution Paragraph 9 9. Calls, to this end, for legislative proposals that keep the digital single market open and competitive by requiring digital service providers to apply effective, coherent, transparent and fair procedures and procedural safeguards to remove illegal content in line with European values and law; firmly believes that this should be harmonised within the digital single market;
Amendment 151 #
Motion for a resolution Paragraph 10 10. Believes
Amendment 152 #
Motion for a resolution Paragraph 10 10. Believes, in this regard, that online platforms that are actively hosting or moderating content should bear more, yet proportionate, responsibility for the infrastructure they provide and the content on it; emphasises that this should be achieved without resorting to general monitoring requirements; proposes the implementation of a common and permanent liability framework for platforms in order to effectively identify and remove illegal content; considers, in particular, that a harmonised EU framework should be based on due diligence obligations so that platforms implement proactive and effective measures, in addition to their obligations relating to transparency and information; considers it important to accompany 'notification and action' procedures for identifying new content with an obligation to monitor content which has already been deemed illegal and removed, in order to prevent it from reappearing online;
Amendment 153 #
Motion for a resolution Paragraph 10 10. Believes, in this regard, that large online platforms that are actively hosting
Amendment 154 #
Motion for a resolution Paragraph 10 10. Believes, in this regard, that online platforms that are
Amendment 155 #
Motion for a resolution Paragraph 10 10. Believes, in this regard, that
Amendment 156 #
Motion for a resolution Paragraph 10 a (new) 10a. Stresses that public authorities must not impose any obligation on digital service providers, neither de jure nor de facto, to monitor the information which they transmit or store, nor a general obligation to seek, moderate or filter content indicating illegal activity.
Amendment 157 #
Motion for a resolution Paragraph 10 b (new) 10b. Is convinced that digital service providers must not be required to prevent the upload of illegal content. Believes at the same time, where technologically feasible, based on sufficiently substantiated orders by democratically accountable competent public authorities, and taking full account of the specific context of the content, that digital service providers may be required to execute periodic searches for distinct pieces of content, which, in line with the ECJ Judgment in Case C-18/18, are identical or equivalent to content that a court had already declared unlawful, and to take that content offline.
Amendment 158 #
Motion for a resolution Paragraph 11 11.
Amendment 159 #
Motion for a resolution Paragraph 11 11.
Amendment 16 #
Motion for a resolution Recital -A (new) -A. whereas fundamental rights, such as protection of privacy and personal data, the principle of non-discrimination, as well as freedom of expression and information, need to be ingrained at the core of a successful and durable European policy on digital services; whereas these rights need to be seen both in the letter of the law and in the spirit of their implementation;
Amendment 160 #
Motion for a resolution Paragraph 11 11. Highlights that this should include rules on the notice-and-action mechanisms and requirements for platforms to take proactive measures that are proportionate to their scale of reach and operational capacities in order to address and prevent the appearance of illegal content on their services; considers that this should entail an obligation for platforms to detect and remove reliably identified Child Sexual Abuse Material (CSAM); supports a balanced duty-of-care approach and a clear chain of responsibility to avoid unnecessary regulatory burdens for the platforms and unnecessary and disproportionate restrictions on fundamental rights, including the freedom of expression;
Amendment 161 #
Motion for a resolution Paragraph 11 11. Highlights that this should include rules on the notice-and-action mechanisms and requirements for platforms to take
Amendment 162 #
Motion for a resolution Paragraph 11 11. Highlights that this should include rules on the notice-and-action mechanisms and requirements for platforms to take proactive measures that are proportionate to their scale of reach and operational
Amendment 163 #
Motion for a resolution Paragraph 11 11.
Amendment 164 #
Motion for a resolution Paragraph 11 11. Highlights that this should include rules on the notice-and-action mechanisms and requirements for platforms to take proactive measures that are proportionate to their scale of reach and operational capacities in order to effectively address the appearance of illegal content on their services, while leaving the choice of the concrete measures to the platforms; supports a balanced duty-of-care approach and a clear chain of responsibility to avoid unnecessary regulatory burdens for the platforms and unnecessary and disproportionate restrictions on fundamental rights, including the freedom of expression;
Amendment 165 #
Motion for a resolution Paragraph 11 11. Highlights that th
Amendment 166 #
Motion for a resolution Paragraph 11 a (new) 11a. Considers that greater regulatory clarity and dialogue with stakeholders is needed to encourage information society service providers to engage in additional voluntary activities to moderate their content in accordance with the law;
Amendment 167 #
Motion for a resolution Paragraph 12 12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter notice procedures, to ensure that removal or blocking decisions are
Amendment 168 #
Motion for a resolution Paragraph 12 12. Stresses the need for appropriate safeguards and due process obligations, including a requirement for human oversight and verification, in addition to counter notice procedures, to ensure that removal or blocking decisions are accurate, well-
Amendment 169 #
Motion for a resolution Paragraph 12 12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter notice procedures, to ensure that removal or blocking decisions are accurate, well- founded and respect fundamental rights; recalls that the possibility of judicial redress, following the final decision taken by the platforms in accordance with the internal complaints system, should be made available to satisfy the right to effective remedy;
Amendment 17 #
Motion for a resolution Recital A b (new) Ab. whereas the trust of users can only be gained by digital services that respect their fundamental rights, thus ensuring the uptake of services as well as a competitive advantage and stable business models for companies;
Amendment 170 #
Motion for a resolution Paragraph 12 12. Stresses the need for appropriate safeguards and due process obligations
Amendment 171 #
Motion for a resolution Paragraph 12 12. Stresses the need for appropriate safeguards and due process obligations, including human oversight and verification, in addition to counter notice procedures, to ensure that removal
Amendment 172 #
Motion for a resolution Paragraph 12 a (new) 12a. Stresses that, in order to protect the freedom of expression and information, it is crucial to maintain the limited liability regime for intermediaries not having knowledge of the illegal activity or information; highlights that the legal regime for digital providers liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers;
Amendment 173 #
Motion for a resolution Paragraph 13 13. Supports limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights; stresses that sanctions should apply to those service providers that fail to comply with legitimate orders;
Amendment 174 #
Motion for a resolution Paragraph 13 13. Supports
Amendment 175 #
Motion for a resolution Paragraph 13 13. Supports the preservation of the current framework on the limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be issued by a judicial authority of the Member State in which a hosting service provider is located and subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights; stresses that
Amendment 176 #
Motion for a resolution Paragraph 13 13. Supports limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights; highlights that removal requests from competent authorities should be specific and clearly state the legal basis for removal; stresses that sanctions should apply to those service providers that fail to comply with legitimate orders;
Amendment 177 #
Motion for a resolution Paragraph 13 13. Supports limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights and civil rights and freedoms; stresses that proportionate sanctions should apply to those service providers that fail to comply with legitimate orders even though they possess the technical and operational capacities;
Amendment 178 #
Motion for a resolution Paragraph 13 13. Supports limited liability
Amendment 179 #
Motion for a resolution Paragraph 13 13. Supports limited liability for content and the country of origin principle, but considers improved coordination for removal requests between national competent authorities to be essential; emphasises that such orders should be subject to legal safeguards in order to prevent abuse and ensure full respect of fundamental rights; stresses that sanctions should apply to those service providers that fail to comply with l
Amendment 18 #
Motion for a resolution Recital B B. whereas the data protection rules applicable to all providers offering digital services in the EU’s territory were recently updated and harmonised across the EU with the General Data Protection Regulation; whereas the Digital Services Act should apply without prejudice to the rules laid down in the General Data Protection Regulation and in other instruments, such as the Copyright Directive;
Amendment 180 #
Motion for a resolution Paragraph 13 a (new) 13a. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent judicial authorities; only a hosting service provider that has actual knowledge of illegal content and its illegal nature shall be subject to content removal obligations;
Amendment 181 #
Motion for a resolution Paragraph 13 a (new) 13a. Notes that user-oriented actions (so-called 'targeting') are an area that requires careful supervision; points out that this practice has been included in the GDPR and should be properly enforced across the EU before new legislation is considered;
Amendment 182 #
Motion for a resolution Paragraph 13 a (new) 13a. Requires digital service providers that become aware of alleged illegal content uploaded by their users to notify the competent public authorities without undue delay.
Amendment 183 #
Motion for a resolution Paragraph 13 b (new) 13b. Calls on all digital actors to comply with the disclosure rules required under the GDPR, including as regards the collection and recording of users' choices and the transmission of those users' choices to technology partners before the processing of personal data;
Amendment 184 #
Motion for a resolution Paragraph 13 b (new) 13b. Requests Member States and digital service providers to put in place transparent, effective, fair, and expeditious complaint and redress mechanisms to allow users to challenge the taking offline of their content.
Amendment 185 #
Motion for a resolution Paragraph 13 c (new)
Amendment 186 #
Motion for a resolution Paragraph 13 d (new) 13d. Believes that neither infrastructure service providers, payment providers and other companies offering services to digital service providers, nor digital service providers with a direct relationship with the user, must be held liable for the content a user uploads or downloads on his or her own initiative. Believes at the same time that digital service providers which have a direct relationship with the user who uploaded the illegal content, and which have the ability to take distinct pieces of the user’s content offline, must be held liable for failing to respond expeditiously to sufficiently substantiated orders by democratically accountable competent public authorities to take the illegal content offline.
Amendment 187 #
Motion for a resolution Paragraph 14 14. Believes that terms of services of digital service providers should be clear, transparent and fair;
Amendment 188 #
Motion for a resolution Paragraph 14 14. Believes that terms of services of digital service providers should be clear, transparent and fair and be made available in an easy and accessible manner to users; deplores the fact that some terms of services from content platforms do not allow law enforcement to use non-personal accounts, which poses a threat both to possible investigations and to personal safety;
Amendment 189 #
Motion for a resolution Paragraph 14 14. Believes that terms of services of digital service providers should be clear, transparent and fair and be made available in an easy and accessible manner to users; deplores the fact that some terms of services from content platforms do not allow law enforcement to use non-personal accounts, which poses a threat both to possible investigations and to personal safety;
Amendment 19 #
Motion for a resolution Recital B B. whereas the data protection rules applicable to all providers offering digital services in the EU’s territory were recently updated and harmonised across the EU with the General Data Protection Regulation; whereas its enforcement needs to be strengthened;
Amendment 190 #
Motion for a resolution Paragraph 14 14. Believes that terms of services of digital service providers should be clear, transparent
Amendment 191 #
Motion for a resolution Paragraph 14 14. Believes that
Amendment 192 #
Motion for a resolution Paragraph 14 14. Believes that the terms of service
Amendment 193 #
Motion for a resolution Paragraph 14 a (new) 14a. Stresses that, in line with the principle of data minimisation established by the General Data Protection Regulation, the Digital Services Act shall require intermediaries to enable the anonymous use of their services and payment for them wherever it is technically possible, as anonymity effectively prevents unauthorised disclosure, identity theft and other forms of abuse of personal data collected online; only where existing legislation requires businesses to communicate their identity, providers of major marketplaces could be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld;
Amendment 194 #
Motion for a resolution Paragraph 15 15. Underlines that
Amendment 195 #
Motion for a resolution Paragraph 15 15. Underlines th
Amendment 196 #
Motion for a resolution Paragraph 15 15.
Amendment 197 #
Motion for a resolution Paragraph 15 15. Underlines that certain types of legal, yet harmful, content should also be addressed to ensure a fair digital ecosystem; expects guidelines to include increased transparency rules on content moderation
Amendment 198 #
Motion for a resolution Paragraph 15 15.
Amendment 199 #
Motion for a resolution Paragraph 15 15.
Amendment 2 #
Motion for a resolution Citation 3 — having regard to the Charter of Fundamental Rights of the European Union, in particular Article 6, Article 7, Article 8, Article 11, Article 13, Article 2
Amendment 20 #
Motion for a resolution Recital B a (new) Ba. whereas the privacy rules in the electronic communication sector, as set out in the Directive concerning the processing of personal data and the protection of privacy in the electronic communications sector, are currently under revision;
Amendment 200 #
Motion for a resolution Paragraph 15 15. Underlines that certain types of legal, yet potentially harmful, content should also be addressed
Amendment 201 #
Motion for a resolution Paragraph 24 b (new) 24b. Notes the potential negative impact of personalised advertising, in particular micro-targeted and behavioural advertisements, and of assessments of individuals, especially minors, by interfering in the private life of individuals, posing questions as to the collection and use of the data used to personalise advertising, offering products or services or setting prices; calls, therefore, on the Commission to introduce a phase-out prohibition on personalised advertisements, starting with minors, and a prohibition on the use of discriminatory practices for the provision of services or products;
Amendment 202 #
Motion for a resolution Paragraph 15 – subparagraph 1 (new) Maintains that these forms of harmful content include micro-targeting based on characteristics exposing physical or psychological vulnerabilities, health-related content such as disinformation on COVID-19 causes or remedies, and emerging issues such as the organised abuse of multiple platforms, artificial intelligence applications creating fake profiles or manipulating online content; points out that special attention should be paid to harmful content in the context of minors using the internet, especially in regard to their exposure to cyberbullying, sexual harassment, pornography, violence or self-harm;
Amendment 203 #
Motion for a resolution Paragraph 15 – indent 1 (new) - Considers that, as a general principle, targeted advertising can have a positive economic and societal impact and points to the fact that GDPR needs to be fully and properly enforced to ensure the respect of users' privacy;
Amendment 204 #
Motion for a resolution Paragraph 15 a (new) 15a. Points out that the Digital Services Act shall not use the legally undefined concept of “harmful content”, but shall address the publication of content that is unlawful; emphasises that the spreading of false and racist information on social media should be contained by giving users control over content proposed to them; stresses that curating content on the basis of tracking user actions shall require the user’s consent; proposes that users of social networks should have a right to see their timeline in chronological order; suggests that dominant platforms shall provide users with an interface to have content curated by software or services of their choice;
Amendment 205 #
Motion for a resolution Paragraph 15 a (new) 15a. Highlights how the personalisation of informational environments that data-driven profiling makes possible brings with it new capacities to manipulate individuals in subtle, yet highly effective ways; underlines that when the profiling is deployed at scale for political micro-targeting to manipulate voting behaviour, it can seriously undermine the foundations of democracy; therefore expects the Commission to provide guidelines on the use of such persuasive digital technologies in electoral campaigns and political advertising policy;
Amendment 206 #
Motion for a resolution Paragraph 15 a (new) 15a. Deems that misleading or obscure political advertising is a special class of online threat because it influences the core mechanisms that enable the functioning of our democratic society, especially when such content is sponsored by third parties and foreign actors; calls, in this regard, for the establishment of strict transparency requirements for the display of paid political advertisement;
Amendment 207 #
Motion for a resolution Paragraph 15 a (new) 15a. Calls on digital service providers to take the necessary measures to identify and label content uploaded by social bots.
Amendment 208 #
Motion for a resolution Paragraph 15 b (new) 15b. Is concerned about platforms and services that deliberately lock their users into one specific platform, thus amplifying their dominant market power and their ability to profile their users even more thoroughly, creating extremely invasive and revealing profiles of their users; calls, therefore, on the Commission to guarantee the interoperability of digital services; considers, in this regard, application programming interfaces (APIs), which enable a user to interconnect between platforms and to import content moderation rules on the content they view on a platform, to be useful tools in bringing true interoperability to users and thus increasing their options to choose between different kinds of recommendation systems and services;
Amendment 209 #
Motion for a resolution Paragraph 15 b (new) 15b. Notes the potential negative impact of micro-targeted advertising, micro-targeted content curation and of assessment of individuals, especially on minors and other vulnerable groups, by interfering in the private life of individuals, posing questions as to the collection and use of the data used to target said advertising, offering products or services or setting prices; reconfirms that the ePrivacy Directive makes targeted content curation subject to an opt-in decision and is otherwise prohibited;
Amendment 21 #
Motion for a resolution Recital B a (new)
Amendment 210 #
Motion for a resolution Paragraph 15 c (new) 15c. Notes that policies for monetisation of content affect what kind of content is seen by users and therefore, ultimately, also what kind of content will be uploaded by users; calls, therefore, for online content hosting platforms to be required to have transparent, non-discriminatory content demonetisation policies in order to guarantee fully the right to freedom of expression online;
Amendment 211 #
Motion for a resolution Paragraph 16 16.
Amendment 212 #
Motion for a resolution Paragraph 16 16. Deems that accountability- and evidence-based policy making requires robust data on the prevalence and removal of illegal content online, as well as the incidence and prevention of illegal activity online, particularly against children and other vulnerable groups;
Amendment 213 #
Motion for a resolution Paragraph 16
Amendment 214 #
Motion for a resolution Paragraph 16 16. Deems that accountability
Amendment 215 #
Motion for a resolution Paragraph 17 17. Calls, in this regard, for a regular public reporting obligation for platforms, proportionate to their scale of reach and operational capacities, more specifically on their content moderation procedures, including standardised data about the amount of content removed and the underlying reasons, the type and justification of removal requests received, the number of requests whose execution was refused and the reasons therefor;
Amendment 216 #
Motion for a resolution Paragraph 17 17. Calls, in this regard, for a regular, comprehensive and consistent public reporting obligation for platforms, proportionate to their scale of reach and operational capacities, including inter alia information on the measures adopted against illegal activities online, the number of items of illegal material removed, and the number and outcome of internal complaints and judicial remedies;
Amendment 217 #
Motion for a resolution Paragraph 17 17. Calls, in this regard, for a regular annual public reporting obligation for platforms, proportionate to their scale of reach and operational capacities; stresses that such reports, covering actions taken in the year preceding the year of submission, should be submitted by the end of the first quarter of that year;
Amendment 218 #
Motion for a resolution Paragraph 17 17. Calls, in this regard, for a regular public reporting obligation for platforms, proportionate to their scale of reach and operational capacities; in the case of child sexual abuse, the reporting should be done to the EU Centre for Preventing and Combating child sexual abuse;
Amendment 219 #
Motion for a resolution Paragraph 17 17. Calls, in this regard, for a regular, comprehensive and consistent public reporting obligation for platforms, proportionate to their scale of reach and operational capacities;
Amendment 22 #
Motion for a resolution Recital C C. whereas the amount of user- generated content, including harmful and illegal content, such as images depicting Child Sexual Abuse Material (CSAM) online, shared via cloud services or online platforms has increased exponentially at an unprecedented pace;
Amendment 220 #
Motion for a resolution Paragraph 17 17. Calls
Amendment 221 #
Motion for a resolution Paragraph 18 18. Calls, moreover, for a regular public reporting obligation for national authorities, including standardised data on the number of removal requests and their legal bases, on the number of removal requests which were subject to administrative or judicial remedies, on the outcome of these proceedings, and on the total number of decisions imposing penalties, including a description of the type of penalty imposed;
Amendment 222 #
Motion for a resolution Paragraph 18 18. Calls, moreover, for a regular public reporting obligation for national authorities, including inter alia information on the number of removal orders, on the number of identified illegal content or activities which led to investigation and prosecution, and the number of cases of content or activities wrongly identified as illegal;
Amendment 223 #
Motion for a resolution Paragraph 18 18. Calls, moreover, for a regular public reporting obligation for national authorities, including inter alia information on the number of removal orders, on the number of identified illegal content or activities which led to investigation and prosecution, and the number of cases of content or activities wrongly identified as illegal;
Amendment 224 #
Motion for a resolution Paragraph 18 18. Calls, moreover, for a regular public reporting obligation for national authorities on their requests for deletion of illegal content from digital platforms;
Amendment 225 #
Motion for a resolution Paragraph 18 18. Calls, moreover, for a regular annual public reporting obligation for national
Amendment 226 #
Motion for a resolution Paragraph 19 19. Expresses its concern regarding the
Amendment 227 #
Motion for a resolution Paragraph 19 19. Expresses its concern regarding the fragmentation of public oversight and supervision of online platforms and other digital service providers and the frequent lack of financial and human resources
Amendment 228 #
Motion for a resolution Paragraph 19 a (new) 19a. Considers that, in order to guarantee proper enforcement of the Digital Services Act, oversight of compliance with the Act should be entrusted to an independent authority, while any decisions relating to content should always remain with the judiciary; emphasises in this regard that sanctions for non-compliance with the Digital Services Act should be based on an assessment of a clearly defined set of factors, such as proportionality, technical and organisational measures and negligence, and that the resulting sanctions should be based on a percentage of a company's annual global turnover;
Amendment 229 #
Motion for a resolution Paragraph 20 Amendment 23 #
Motion for a resolution Recital C C. whereas the amount of
Amendment 230 #
Motion for a resolution Paragraph 20 Amendment 231 #
Motion for a resolution Paragraph 20 20.
Amendment 232 #
Motion for a resolution Paragraph 20 20.
Amendment 233 #
Motion for a resolution Paragraph 20 20. Supports the creation of an independent EU body to exercise effective oversight of compliance with the applicable rules;
Amendment 234 #
Motion for a resolution Paragraph 20 20. Supports the
Amendment 235 #
Motion for a resolution Paragraph 20 20. Supports the creation of an
Amendment 236 #
Motion for a resolution Paragraph 21 Amendment 237 #
Motion for a resolution Paragraph 21 21. Considers that the transparency reports drawn up by platforms and national competent authorities should be made available to th
Amendment 238 #
Motion for a resolution Paragraph 21 21. Considers that the transparency reports drawn up by platforms and
Amendment 239 #
Motion for a resolution Paragraph 21 21. Considers that the transparency reports drawn up by platforms and national competent authorities should be made available to this EU
Amendment 24 #
Motion for a resolution Recital C C. whereas the amount of user- generated content, including harmful and illegal content, shared via
Amendment 240 #
Motion for a resolution Paragraph 21 21. Considers that the transparency reports drawn up by platforms and national
Amendment 241 #
Motion for a resolution Paragraph 21 21. Considers that the transparency reports drawn up by platforms and national competent authorities should be made available to th
Amendment 242 #
Motion for a resolution Paragraph 22 Amendment 243 #
Motion for a resolution Paragraph 22 Amendment 244 #
Motion for a resolution Paragraph 22 22.
Amendment 245 #
Motion for a resolution Paragraph 22 22. Stresses that this EU body should not take on the role of content moderator, but that it should analyse, upon complaint or on its own initiative, whether and how digital service providers amplify illegal content, for example through recommendation engines and optimisation features such as autocomplete and trending; calls for this regulator to have the power to impose proportionate fines or other corrective actions when platforms do not provide sufficient information on their procedures or algorithms in a timely manner;
Amendment 246 #
Motion for a resolution Paragraph 22 22. Stresses that th
Amendment 247 #
Motion for a resolution Paragraph 22 22. Stresses that this EU
Amendment 248 #
Motion for a resolution Paragraph 22 a (new) 22a. Is concerned that the increased use of automated decision making and machine learning for purposes such as identification, prediction of behaviour or targeted advertising leads to exacerbated direct and indirect discrimination based on grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation when using digital services; insists that the Digital Services Act must aim to ensure a high level of transparency as regards the functioning of online services and a digital environment free of discrimination;
Amendment 249 #
Motion for a resolution Paragraph 22 a (new) 22a. Considers it necessary, given the increasing fragmentation of national laws on tackling illegal content, to strengthen cooperation mechanisms between the Member States, including with the support of the Commission and the EU agencies; underlines the importance of such a dialogue, in particular on countries' differing opinions as to whether content is illegal or not and its potential impact;
Amendment 25 #
Motion for a resolution Recital C C. whereas the amount of all types of user-
Amendment 250 #
Motion for a resolution Paragraph 22 a (new) 22a. Welcomes the Commission initiative to create a European Digital Media Observatory to support independent fact-checking services, increase public knowledge on online disinformation and support public authorities in charge of monitoring digital media;
Amendment 251 #
Motion for a resolution Paragraph 22 b (new) 22b. Is concerned that the increased use of automated decision making and machine learning for purposes such as identification, prediction of behaviour or targeted advertising may lead to exacerbated direct and indirect discrimination based on grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation when using digital services; insists that the Digital Services Act must aim to ensure a high level of transparency as regards the functioning of online services and a digital environment free of discrimination;
Amendment 252 #
Motion for a resolution Paragraph 22 b (new) 22b. Considers it necessary also to strengthen marketplace liability by means of a specific liability regime in order to protect consumers from dangerous products, particularly through information and transparency, and to guarantee their rights;
Amendment 253 #
Motion for a resolution Paragraph 23 23. Underlines the importance of empowering users to enforce their own fundamental rights online, including by means of easily accessible complaint procedures,
Amendment 254 #
Motion for a resolution Paragraph 23 23. Underlines the importance of empowering users to enforce their own fundamental rights online, including by means of
Amendment 255 #
Motion for a resolution Paragraph 23 23. Underlines the importance of empowering users to enforce their own fundamental rights online, including by means of easily accessible, impartial, efficient and free complaint procedures, legal remedies, educational measures and awareness-raising on data protection issues;
Amendment 256 #
Motion for a resolution Paragraph 23 a (new) 23a. Underlines that in order to fully enjoy fundamental rights, such as freedom of expression and access to information, developed media literacy skills remain crucial; recalls the fundamental role of media literacy as one of the primary solutions to growing disinformation- and hate speech-related issues, and calls on the Commission and Member States to improve media literacy through support for educational initiatives aimed at both students and professional educators, as well as through targeted awareness-raising campaigns within civil society;
Amendment 257 #
Motion for a resolution Paragraph 23 a (new) 23a. Emphasises that agreed standards of essential security in cyberspace are indispensable if digital services are to provide their full benefits to citizens; notes therefore the urgent need for Member States to take coordinated action to ensure basic cyber hygiene and to prevent avoidable dangers in cyberspace, including through legislative measures;
Amendment 258 #
Motion for a resolution Paragraph 23 a (new) 23a. Further to empowering users to enforce their fundamental rights online, calls on the Commission to ensure that users have access to diverse and quality content online as a means towards informed citizenship; in this respect calls on the Commission to propose safeguards ensuring that quality media content is easy to access and easy to find on third-party platforms;
Amendment 259 #
Motion for a resolution Paragraph 23 a (new) 23a. Further calls on the Commission to establish a framework that prevents platforms from exercising a second layer of control over content that is provided under a media service provider’s responsibility and which is subject to specific standards and oversight;
Amendment 26 #
Motion for a resolution Recital C C. whereas the
Amendment 260 #
Motion for a resolution Paragraph 23 a (new) 23a. Requests that digital service providers, to the maximum extent possible, give their users the possibility to choose which content they want to be presented with and in which order.
Amendment 261 #
Motion for a resolution Paragraph 23 b (new) 23b. Stresses that the only way for digital services to achieve their full potential is to enable users to be identified unambiguously, in a manner equivalent to offline services; notes that online identification can be improved by enforcing the eIDAS Regulation’s cross-border interoperability of electronic identification across the European Union; recalls that Member States and the European institutions have to guarantee that electronic identification is secure, enables data minimisation and complies with all other aspects of the GDPR;
Amendment 262 #
Motion for a resolution Paragraph 23 b (new) 23b. Further to empowering users to enforce their fundamental rights online, calls on the Commission to ensure that users have access to diverse and quality content online as a means towards informed citizenship; in this respect calls on the Commission to propose safeguards ensuring that quality media content is easy to access and easy to find on third-party platforms;
Amendment 263 #
Motion for a resolution Paragraph 23 b (new) 23b. Further calls on the Commission to establish new rules that proscribe platforms’ practices interfering with media freedom and pluralism, in particular by prohibiting platforms from exercising a second layer of control over content that is provided under a media service provider’s responsibility and which is subject to specific standards and oversight;
Amendment 264 #
Motion for a resolution Paragraph 23 b (new) 23b. Requests, based on the principles above, that the Digital Services Act harmonises and replaces the liability measures laid down in the Digital Single Market Copyright Directive, the Audiovisual Media Services Directive and the Terrorist Content Online Regulation.
Amendment 265 #
Motion for a resolution Paragraph 23 c (new) 23c. Strongly underlines the importance of media pluralism, public service media and independent as well as non-commercial media for citizens’ access to quality information; firmly believes that disinformation and “fake news” can only be curbed by public access to high-quality information and education; underlines the importance of mechanisms that support independent media and public service media;
Amendment 266 #
Motion for a resolution Paragraph 24 a (new) 24a. Recommends that the Commission create a verification system for users of digital services, in order to ensure the protection of personal data and age verification, especially for minors, which shall not be used to track users cross-site nor be used commercially;
Amendment 267 #
Motion for a resolution Paragraph 24 c (new) 24c. Notes the unnecessary collection of personal data by digital services at the point of registration for a service, such as date and place of birth, mobile phone number and postal address, often caused by the use of single sign-in options; underlines that the GDPR clearly describes the data minimisation principle, thereby limiting the data collected to only that strictly necessary for the purpose; asks the Commission therefore to create a single European sign-in system and to introduce an obligation always to offer a manual sign-in option as well;
Amendment 27 #
Motion for a resolution Recital C a (new) Ca. whereas personal data collected from the digital traces of individuals, even when seemingly innocuous, can be used for individual profiling and subsequently repurposed, and can be mined to generate insights that enable very intimate personal information to be inferred with a very high level of accuracy, especially when these data are merged with other data sets;
Amendment 28 #
Motion for a resolution Recital C b (new) Cb. whereas social media and other content distribution platforms utilise profiling techniques to target and distribute their content, as well as advertisements; whereas automated algorithms decide how to handle, prioritise, distribute and delete third-party content on online platforms, including during political and electoral campaigns;
Amendment 29 #
Motion for a resolution Recital C c (new) Cc. whereas the proliferation of disinformation, even propaganda online, has been aided by platforms whose very business model is based on profiting from collection and analysis of user data; whereas consequently promoting spreadable, sensationalist content forms part of their business logic, and pushes them to generate more traffic and ‘clicks’, and, in turn, generate more profiling data and thus more profit;
Amendment 3 #
Motion for a resolution Citation 5 — having regard to Regulation (EU) 2016/679 of the European Parliament and
Amendment 30 #
Motion for a resolution Recital C d (new) Cd. whereas the Cambridge Analytica and Facebook scandals revealed how user data had been used to micro-target certain voters with political advertising, and at times even with targeted disinformation, thereby showing the danger of the opaque data processing operations of online platforms;
Amendment 31 #
Motion for a resolution Recital C e (new) Ce. whereas the widespread use of algorithms for content filtering and content removal processes also raises rule of law concerns and questions of legality, legitimacy and proportionality;
Amendment 32 #
Motion for a resolution Recital D D. whereas a small number of mostly non-European large service providers have
Amendment 33 #
Motion for a resolution Recital D D. whereas a small number of mostly non-European service providers have
Amendment 34 #
Motion for a resolution Recital D D. whereas a small number of mostly non-European service providers have significant market power and exert influence on the rights and freedoms of individuals, our societies and democracies; whereas such service providers have to comply with the GDPR when offering services in the Union;
Amendment 35 #
Motion for a resolution Recital D D. whereas a small number of mostly non-European service providers have significant market power and exert influence over suppliers and control how information, services and products are presented, thereby having an impact on the rights and freedoms of individuals, and our societies
Amendment 36 #
Motion for a resolution Recital E E. whereas the political approach to tackle harmful and illegal content online in the EU has mainly focused on voluntary cooperation thus far, but a growing number of Member States are adopting national legislation to address illegal content and provisions to address certain types of content were included in recent sectoral legislation at EU level;
Amendment 37 #
Motion for a resolution Recital E E. whereas the political approach to tackle
Amendment 38 #
Motion for a resolution Recital E E. whereas the poli
Amendment 39 #
Motion for a resolution Recital E E. whereas the
Amendment 4 #
Motion for a resolution Citation 6 a (new) — having regard to Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive)3a, _________________ 3a OJ L 95, 15.4.2010, p. 1–24
Amendment 40 #
Motion for a resolution Recital E a (new) Amendment 41 #
Motion for a resolution Recital E b (new) Eb. whereas online hate speech and disinformation are increasingly being used as tools to increase social polarisation, which is in turn exploited for political purposes; whereas combating them is not only relevant to the domain of human rights, but is also a fundamental factor in terms of the defence of the rule of law and democracy in the EU;
Amendment 42 #
Motion for a resolution Recital F Amendment 43 #
Motion for a resolution Recital F F. whereas some forms of harmful content may be legal, yet detrimental to society or democracy
Amendment 44 #
Motion for a resolution Recital F F. whereas some forms of
Amendment 45 #
Motion for a resolution Recital F F. whereas some
Amendment 46 #
Motion for a resolution Recital F F. whereas some forms of
Amendment 47 #
Motion for a resolution Recital G G. whereas a pure self-regulatory approach of platforms does not provide legitimacy or adequate transparency and proper information to public authorities, civil society and users on how platforms address illegal
Amendment 48 #
Motion for a resolution Recital G G. whereas a pure self-regulatory approach of platforms does not provide adequate transparency, accountability and oversight to public authorities, civil society and users on how platforms address illegal
Amendment 49 #
Motion for a resolution Recital G Amendment 5 #
Motion for a resolution Citation 7 c (new) Amendment 50 #
Motion for a resolution Recital G G. whereas a pure self-regulatory approach of platforms
Amendment 51 #
Motion for a resolution Recital G G. whereas a pure self-regulatory approach of platforms does not provide adequate transparency to public authorities, civil society and users on how platforms address illegal a
Amendment 52 #
Motion for a resolution Recital H H. whereas regulatory oversight and supervision
Amendment 53 #
Motion for a resolution Recital I Amendment 54 #
Motion for a resolution Recital I Amendment 55 #
Motion for a resolution Recital I I. whereas the absence of uniform and transparent rules for procedural safeguards across the EU is a key obstacle for persons affected by illegal
Amendment 56 #
Motion for a resolution Recital J J. whereas the lack of robust public data on the prevalence and removal of illegal and harmful content online creates a deficit of accountability, both in the private and the public sector; whereas this includes the use and underlying source codes of algorithmic processes, and how platforms address the erroneous removal of content;
Amendment 57 #
Motion for a resolution Recital J J. whereas the lack of robust public data on the prevalence and removal of illegal and harmful content online, as well as the lack of proper transparency from internet platforms and services as to the algorithms they use, creates a deficit of accountability;
Amendment 58 #
Motion for a resolution Recital J J. whereas the lack of comparable, robust public data on the prevalence, and on both court-mandated and self-regulatory removal, of illegal and harmful content online creates a deficit of transparency and accountability;
Amendment 59 #
Motion for a resolution Recital J J. whereas the lack of robust public data on notices and follow-up by competent authorities about, and data on the prevalence and removal of illegal
Amendment 6 #
Motion for a resolution Citation 8 a (new) — having regard to Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive)1a in view of changing market realities; _________________ 1a OJ L 303, 28.11.2018, p. 69–92
Amendment 60 #
Motion for a resolution Recital J a (new) Ja. whereas persons of colour, persons belonging to or who are perceived to belong to ethnic or linguistic minorities, asylum seekers, migrants, LGBTIQ persons and women often experience high levels of discriminatory hate speech, bullying, threats and scapegoating online and run high risks of experiencing so-called "shit storms";
Amendment 61 #
Motion for a resolution Recital J b (new) Jb. whereas algorithms used for automated decision-making or profiling often reproduce existing discriminatory patterns in society, thereby leading to a high risk of exacerbated discrimination for persons already affected.
Amendment 62 #
Motion for a resolution Recital K Amendment 63 #
Motion for a resolution Recital K Amendment 64 #
Motion for a resolution Recital K K. whereas child sexual exploitation online is shaped by technological developments; whereas the vast amount of child sexual abuse material circulating online poses serious challenges for detection, investigation and, most of all, victim identification efforts; whereas the lockdown resulting from the Covid-19 health crisis has seen a 106% rise in online traffic in child pornography according to Europol 1a; _________________ 1a Catherine de Bolle, Executive Director of Europol, in an exchange of views with Parliament's LIBE Committee on 18 May 2020.
Amendment 65 #
Motion for a resolution Recital K K. whereas child sexual exploitation online is shaped by technological developments, such as the increased use of end-to-end encryption and the dark web; whereas the vast amount of child sexual abuse material circulating online poses serious challenges for detection, investigation and, most of all, victim identification efforts;
Amendment 66 #
Motion for a resolution Recital K K. whereas child sexual exploitation online is one of the forms of illegal content shaped by technological developments; whereas the vast amount of child sexual abuse material circulating online poses serious challenges for detection, investigation and, most of all, victim identification efforts;
Amendment 67 #
Motion for a resolution Recital L Amendment 68 #
Motion for a resolution Recital L L. whereas, according to the jurisprudence of the Court of Justice of the European Union (CJEU), host providers may have
Amendment 69 #
Motion for a resolution Recital L L. whereas a
Amendment 7 #
Motion for a resolution Citation 7 Amendment 70 #
Motion for a resolution Recital L L. whereas according to the Court of
Amendment 71 #
Motion for a resolution Recital L a (new) La. whereas trusted electronic identification is essential to ensure secure access to digital services and to carry out electronic transactions in a safer way; whereas currently only 15 Member States have notified an electronic identity scheme for cross-border recognition in the framework of Regulation (EU) No 910/2014;
Amendment 72 #
Motion for a resolution Recital L a (new) La. whereas the internet and internet platforms are still a key location for terrorist groups’ activities, and are used as tools for spreading propaganda, recruitment and the promotion of their activities;
Amendment 73 #
Motion for a resolution Paragraph -1 (new) -1. Underlines that digital services and their underlying algorithms need to fully respect fundamental rights, especially the protection of privacy and personal data, non-discrimination and the freedom of speech and information, as enshrined in the Treaties and the Charter of Fundamental Rights of the European Union;
Amendment 74 #
Motion for a resolution Paragraph -1 (new) -1. Stresses that the reform of the current liability regime for digital service providers must be proportionate, must not disadvantage small and medium sized companies, and must not limit innovation, access to information, and freedom of expression.
Amendment 75 #
Motion for a resolution Paragraph -1 a (new) -1a. Emphasises that the rapid development of digital services requires strong legislation to protect privacy; stresses therefore in this regard that all digital services need to fully respect Union data protection and privacy law, namely Regulation (EU) 2016/679 of the European Parliament and of the Council (GDPR) and Directive 2002/58/EC of the European Parliament and of the Council (ePrivacy), currently under revision, as well as the freedom of expression;
Amendment 76 #
Motion for a resolution Paragraph -1 b (new) -1b. Stresses that, in line with the principle of data minimisation established by the General Data Protection Regulation, the Digital Services Act shall require intermediaries to enable the anonymous use of, and payment for, their services wherever technically possible, as anonymity effectively prevents unauthorised disclosure, identity theft and other forms of abuse of personal data collected online; only where existing legislation requires businesses to communicate their identity could providers of major marketplaces be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld;
Amendment 77 #
Motion for a resolution Paragraph -1 c (new) -1c. Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy and the protection of personal data; confirms that users have a right not to be subject to pervasive tracking when using digital services; stresses that, in the spirit of the jurisprudence on communications metadata, public authorities shall be given access to a user’s subscriber data and metadata only to investigate suspects of serious crime, with prior judicial authorisation;
Amendment 78 #
Motion for a resolution Paragraph -1 d (new) -1d. Is concerned that single sign-in services can be used to track users across platforms; therefore opposes the creation of a single Union sign-in system; recommends that providers which support a single sign-in service with a dominant market share should be required to also support at least one open and federated identity system based on a non-proprietary framework;
Amendment 79 #
Motion for a resolution Paragraph -1 e (new) -1e. Stresses that in order to overcome the lock-in effect of centralised networks and to ensure competition and consumer choice, users of dominant social media services and messaging services shall be given a right to cross-platform interaction via open interfaces (interconnectivity); highlights that these users shall be able to interact with users of alternative services, and that the users of alternative services shall be allowed to interact with them;
Amendment 8 #
Motion for a resolution Citation 7 Amendment 80 #
Motion for a resolution Paragraph 1 Amendment 81 #
Motion for a resolution Paragraph 1 1. Stresses that illegal content online
Amendment 82 #
Motion for a resolution Paragraph 1 1. Stresses that illegal content and cyber-enabled crimes, such as child sexual exploitation online, should be tackled with the same rigour as illegal content and behaviour offline;
Amendment 83 #
Motion for a resolution Paragraph 1 1. Stresses that illegal content online should be tackled with the same rigour and based on the same legal principles as illegal content offline;
Amendment 84 #
Motion for a resolution Paragraph 1 a (new) Amendment 85 #
Motion for a resolution Paragraph 1 a (new) 1a. Underlines that the modernisation of the current e-Commerce rules will inevitably affect fundamental rights, including the protection of privacy and personal data, the freedom of expression and information, equality and non-discrimination, freedom of thought, conscience and religion, freedom of assembly and association, freedom of the arts and sciences, and the right to an effective remedy; therefore urges the Commission to be extremely vigilant in its approach and also to integrate international human rights standards into its revision;
Amendment 86 #
Motion for a resolution Paragraph 1 a (new) 1a. Is convinced that it is solely the task of democratically accountable competent public authorities to decide on the legality of content online.
Amendment 87 #
Motion for a resolution Paragraph 1 b (new) 1b. Notes that the current digital ecosystem also encourages problematic behaviour, such as hate speech and disinformation; is concerned at how promoting controversial content has become key to targeted-advertisement-based business models, where sensational and polarising content maximises users’ screen time, generating more profiling data, more advertising hours and therefore more profits; underlines how this type of business model can have very intrusive and negative effects, not only on individuals and their fundamental rights, but on societies as a whole;
Amendment 88 #
Motion for a resolution Paragraph 1 b (new) 1b. Stresses that digital service providers must only be mandated to take their users’ content offline based on sufficiently substantiated orders by democratically accountable competent public authorities.
Amendment 89 #
Motion for a resolution Paragraph 2 2.
Amendment 9 #
Motion for a resolution Citation 7 a (new) Amendment 90 #
Motion for a resolution Paragraph 2 2. Believes in the clear societal and economic benefits of a functioning digital single market for the EU and its Member States; welcomes these benefits, in particular improved access to information and the strengthening of the freedom of expression; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights and data protection are respected; calls for a minimum level of intervention based on the principles of necessity and proportionality;
Amendment 91 #
Motion for a resolution Paragraph 2 2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights - including freedom of expression and information, and media freedom and pluralism - and data protection are respected; calls for a minimum level of intervention based on the principles of necessity and proportionality;
Amendment 92 #
Motion for a resolution Paragraph 2 2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights and data protection are respected and in which citizens' online digital security is guaranteed; calls for a
Amendment 93 #
Motion for a resolution Paragraph 2 2. Believes in the
Amendment 94 #
Motion for a resolution Paragraph 2 2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights, including freedom of expression, privacy and data protection, are respected; calls for a minimum level of intervention based on the principles of necessity and proportionality;
Amendment 95 #
Motion for a resolution Paragraph 2 2. Believes in the clear economic benefits of a functioning digital single market for the EU and its Member States; stresses the important obligation to ensure a fair digital ecosystem in which fundamental rights
Amendment 96 #
Motion for a resolution Paragraph 2 2. Believes in the
Amendment 97 #
Motion for a resolution Paragraph 2 a (new) 2a. Underlines that digital services and their underlying algorithms need to fully respect fundamental rights, especially privacy, the protection of personal data, non-discrimination and the freedom of expression and information, as enshrined in the Treaties and the Charter of Fundamental Rights of the European Union; calls therefore on the Commission to implement an obligation of transparency and explainability of algorithms, penalties to enforce such obligations and the possibility of human intervention, as well as other measures, such as independent audits and specific stress tests to assist and enforce compliance; believes that such independent audits should be conducted annually, by analogy with the financial sector, to examine whether the data policy used, the algorithms and the checks and balances are in accordance with specified criteria and are supervised by a sufficiently empowered independent oversight authority;
Amendment 98 #
Motion for a resolution Paragraph 2 a (new) 2a. Is convinced that digital service providers must not retain data for law enforcement purposes unless a targeted retention of an individual user’s data is directly ordered by a democratically accountable competent public authority in line with Union law.
Amendment 99 #
Motion for a resolution Paragraph 2 b (new) 2b. Notes that digital services use advanced algorithms for profiling, which analyse or predict aspects of users' personal preferences, interests or behaviour; emphasises that the quality of the output of automated decision-making algorithms depends on the quality of the data used and the predetermined parameters chosen; stresses that the use of automated decision-making algorithms requires a strong legislative framework which protects privacy and personal data and which, together with a duty-of-care obligation overseeing the legitimate use of the algorithms that does not apply to content moderation, ensures full compliance; calls therefore on the Commission to work out a duty-of-care regime, with its basis in the e-Commerce Directive, through detailed sectoral guidelines, in order to use automated decision-making algorithms in compliance with the fundamental rights to the protection of personal data and privacy laid down in the General Data Protection Regulation;
source: 653.762