Progress: Procedure completed
Role | Committee | Rapporteur | Shadows
---|---|---|---
Lead | JURI | WÖLKEN Tiemo (S&D) | SZÁJER József (EPP), MELCHIOR Karen (Renew), BREYER Patrick (Verts/ALE), LEBRETON Gilles (ID), DZHAMBAZKI Angel (ECR), MAUREL Emmanuel (GUE/NGL)
Committee Opinion | IMCO | CHARANZOVÁ Dita (Renew) | CAMPOMENOSI Marco (ID), KOLAJA Marcel (Verts/ALE), TÓTH Edina (PPE), JURZYCA Eugen (ECR), LEITÃO-MARQUES Maria-Manuel (S&D)
Committee Opinion | CULT | KAMMEREVERT Petra (S&D) | NIENASS Niklas (Verts/ALE), GEORGOULIS Alexis (GUE/NGL), FRANKOWSKI Tomasz (PPE), MELBĀRDE Dace (ECR)
Lead committee dossier:
Legal Basis: RoP 47
Subjects
Events
The European Parliament adopted by 637 votes to 26, with 28 abstentions, a resolution with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online.
Parliament called on the Commission to present without delay a package of legislative proposals constituting a digital services act with an adequate material, personal and territorial scope, defining the key concepts and including the recommendations made in its resolution.
Digital services act
The new digital services act should include a Regulation establishing contractual rights for content management, setting transparent, fair, binding and uniform standards and procedures for content moderation, and ensuring independent and affordable access to judicial redress.
The requested proposal for a Regulation should apply to platforms that host and manage content accessible to the public on websites or through applications within the Union, irrespective of their place of establishment or registration or their principal place of business.
Content moderation principles
Parliament stressed the need to better protect consumers by providing reliable and transparent information on examples of malpractice, such as misleading claims and scams. It considered that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require any tracking of user interaction with content, and that being shown behavioural advertising should be conditional on users’ freely given, specific, informed and unambiguous consent.
Furthermore, the proposed Regulation should prohibit content moderation practices which are discriminatory, in particular against the most vulnerable persons, and should respect at all times the fundamental rights and freedoms of users, in particular freedom of expression.
Content hosting platforms should:
- be accountable for ensuring that their content management practices are fair, transparent and proportionate;
- provide users with sufficient information on their content curation profiles and the individual criteria according to which content hosting platforms curate content for them, including information as to whether algorithms are used and their objectives;
- provide users with an appropriate degree of influence over the curation of content made visible to them, including the choice of opting out of content curation altogether.
Transparency obligations
The proposed Regulation should oblige digital service providers to take the necessary measures to allow the disclosure of the funding of any interest groups with which the users of the providers’ digital services are associated, thus enabling the person who is legally responsible to be identified.
Commercial digital service providers who are established outside the Union should designate a legal representative for the purposes of user interests within the Union and make the contact information of that representative visible and accessible on their online platforms.
Notice procedures
Content hosting platforms should include in their terms and conditions clear, accessible, intelligible and unambiguous information regarding notice procedures.
Upon a notice being issued, and before any decision on the content has been made, the uploader of the content in question should receive the reason for the notice. All parties concerned should be informed of the decision resulting from a notification. Content hosting platforms should act expeditiously to make unavailable or remove content which is manifestly illegal.
Independent dispute settlement
Parliament recommended that Member States should provide independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse when decisions on content moderation are appealed against.
In order to protect anonymous publications and the general interest, not only the user who uploaded the content that is the subject of a dispute but also a third party, such as an ombudsperson, with a legitimate interest in acting should be able to challenge content moderation decisions. Users should have the right to take legal action at a later stage.
Reports
Members proposed that the platforms should: (i) submit detailed transparency reports to the European entity on a regular basis, based on a consistent methodology and assessed in the light of performance indicators, including on their content policies and (ii) publish these reports and their content management policies in an accessible manner in a publicly accessible database.
Smart contracts and blockchains
The Commission is called on to assess the development and use of distributed ledger technologies, including blockchain and, in particular, smart contracts; to provide guidance to ensure legal certainty for businesses and consumers, in particular regarding questions of legality, the enforcement of smart contracts in cross-border situations and notarisation requirements where applicable; and to make proposals for the appropriate legal framework.
Documents
- Results of vote in Parliament
- Decision by Parliament: T9-0273/2020
- Debate in Parliament
- Committee report tabled for plenary, single reading: A9-0177/2020
- Committee opinion: PE648.593
- Committee opinion: PE648.645
- Amendments tabled in committee: PE652.466
- Amendments tabled in committee: PE652.517
- Committee draft report: PE650.529
Votes
A9-0177/2020 - Tiemo Wölken - Am 2 #
A9-0177/2020 - Tiemo Wölken - Am 3 #
A9-0177/2020 - Tiemo Wölken - § 8/1 #
A9-0177/2020 - Tiemo Wölken - § 8/2 #
A9-0177/2020 - Tiemo Wölken - § 8/3 #
A9-0177/2020 - Tiemo Wölken - § 17/1 #
A9-0177/2020 - Tiemo Wölken - § 17/2 #
A9-0177/2020 - Tiemo Wölken - Am 4 #
A9-0177/2020 - Tiemo Wölken - Am 1 #
A9-0177/2020 - Tiemo Wölken - Resolution #
Amendments | Dossier
---|---
615 | 2020/2019(INL)
2020/04/29
CULT
37 amendments
Amendment 1 #
Draft opinion Paragraph 1 1.
Amendment 10 #
Draft opinion Paragraph 1 b (new) 1 b. Highlights that relevant forthcoming legislative proposals should aim to strengthen the digital single market, and thus must avoid creating new barriers that prevent growth in digital services; stresses that any new obligations on platforms should be proportional to their market share and financial capacity, in order to help even the level playing field and promote competition instead of stifling it;
Amendment 11 #
Draft opinion Paragraph 1 c (new)
Amendment 12 #
Draft opinion Paragraph 2 2. Notes that communication always takes place in a given context, which is why automated procedures may support individual decisions on the legality of content, but may under no circumstances replace them; notes that editorial decisions, algorithmic processes and arbitrary content removal by online platforms can have a large impact on freedom of expression and access to information; calls for a safe digital environment promoting diversity of opinion, net neutrality, freedom of speech, and access to information and a balanced approach between individual freedom and fundamental rights;
Amendment 13 #
Draft opinion Paragraph 2 2. Notes that communication always takes place in a given context
Amendment 14 #
Draft opinion Paragraph 2 2.
Amendment 15 #
Draft opinion Paragraph 2 2. Notes that communication always takes place in a given context, which is why automated procedures may support individual decisions on the legality of
Amendment 16 #
Draft opinion Paragraph 2 2. Notes that communication always takes place in a given context,
Amendment 17 #
Draft opinion Paragraph 2 2. Notes that communication always takes place in a given context, which is why automated procedures
Amendment 18 #
Draft opinion Paragraph 2 a (new) 2 a. Calls on the Commission to ensure that platform operators make available transparency reports with information about the number of cases where content was misidentified as illegal or as illegally shared, and that competent authorities make available information about the number of cases where removals lead to the investigation and the prosecution of crimes;
Amendment 19 #
Draft opinion Paragraph 2 a (new) 2 a. Recalls that the continuous spread of illegal content online represents a significant threat for the European cultural and creative sectors and the online marketplace; calls for a strengthened legal framework to ensure that service providers take effective measures to remove illegal content from their services and ensure that such content remains inaccessible after being removed;
Amendment 2 #
Draft opinion Paragraph 1 1. Calls for steps to be taken to especially safeguard the availability and accessibility of content for which editorial responsibility
Amendment 20 #
Draft opinion Paragraph 2 a (new) 2 a. Notes that the fight against disinformation, misinformation and mal- information spreading on media platforms, including social media platforms, requires significant corporate social responsibility, based on trust and transparency, in order to counter propaganda and hate speech undermining the Union principles and values;
Amendment 21 #
Draft opinion Paragraph 2 a (new) 2a. Calls for the use of all technologically feasible means of combating harmful or illegal content on the internet to successfully undergo careful prior constitutional vetting, while at the same time rejecting prior checks on content as disproportionate;
Amendment 22 #
Draft opinion Paragraph 2 a (new) 2 a. Calls for measures and specific provisions within the Digital Services Act to effectively halt the spread of disinformation and misleading information shared via online programs and platforms of digital marketing and distribution;
Amendment 23 #
Draft opinion Paragraph 2 b (new) 2b. Calls, moreover, for any notice and action procedure to comply with minimum formal requirements, making it compulsory to inform the uploader about a notification, the further procedure and possibilities of objection, and making provision for an independent dispute settlement procedure; the addressed provider must examine the merits of the complaint by seriously attempting to ascertain the facts needed in order to weigh up the interests of all the parties involved; a purely formal examination is not sufficient;
Amendment 24 #
Draft opinion Paragraph 2 b (new) 2 b. Notes that findability should be made transparent to the maximum extent possible, assisting citizens to discover means to address their needs and take decisions as informed customers, especially after the COVID-19 outbreak, where the use of online services has increased;
Amendment 25 #
Draft opinion Paragraph 3 3. Points out that in addition to transparency obligations, regulations on the findability of content and restrictions on self-referencing can make a significant contribution to the dissemination of lawful content
Amendment 26 #
Draft opinion Paragraph 3 3. Points out that in addition to transparency obligations,
Amendment 27 #
Draft opinion Paragraph 3 3. Points out that in addition to transparency obligations, regulations on the findability of content and restrictions on self-referencing can make a significant contribution to the dissemination of lawful content
Amendment 28 #
Draft opinion Paragraph 3 3. Points out that in addition to
Amendment 29 #
Draft opinion Paragraph 3 3. Points out that in addition to transparency obligations,
Amendment 3 #
Draft opinion Paragraph 1 1. Calls for steps to be taken to safeguard the availability of content for which editorial responsibility is taken or which is produced by journalists and all other media that are already subject to a generally recognised independent oversight on other platforms or in other services so that their content is not subjected to any further controls, while applying clear rules on platform liability with regards to data privacy, online security, transparency, and the enforcement of fundamental rights;
Amendment 30 #
Draft opinion Paragraph 3 a (new)
Amendment 31 #
Draft opinion Paragraph 3 a (new) 3 a. Requests the Commission to consider recent national case law setting 30 minutes as the time span for service providers to take down infringing content and to clarify the notion of “expeditious” with regard to live content;
Amendment 32 #
Draft opinion Paragraph 3 a (new) 3 a. Reiterates that pro-competitive data access systems complementing competition law enforcement should seek to decentralise the data held by data holders whilst maintaining incentives to innovate for the benefit of consumers;
Amendment 33 #
Draft opinion Paragraph 3 a (new) 3 a. Points to the fact that fundamental freedoms, such as freedom of speech, consumer choice and right to privacy, should be at the heart of new rules, with an aim to achieve a level playing field across the whole sector;
Amendment 34 #
Draft opinion Paragraph 3 b (new) 3 b. Stresses the importance of removing current and potential new barriers, restrictions and burdens in the supply of digital services, especially for SMEs and start-ups, while at the same time ensuring responsible, non- discriminatory behaviour of platforms and proportional obligations, whether online or offline;
Amendment 35 #
Draft opinion Paragraph 3 b (new) 3 b. Reiterates that pro-competitive data access systems complementing competition law enforcement should seek to decentralise the data held by data holders, whilst maintaining incentive to innovate for the benefit of consumers;
Amendment 36 #
Draft opinion Paragraph 3 c (new) 3 c. Strongly believes that there is a need to strengthen platform liability, when it comes to illegal and unsafe products, thus re-enforcing the digital single market; recalls that in those cases, platform liability should be fit for its purpose, considering the consumer safeguards in place, which should be observed at all times, and the establishment of concomitant redress measures for retailers and consumers; believes that the system could only function if enforcement authorities have sufficient powers, tools and resources to enforce the provisions and efficiently cooperate for cases with a transnational element;
Amendment 37 #
Draft opinion Paragraph 3 c (new) 3 c. Stresses the need to update, modify, increase the comprehensiveness, clarity, and transparency of Union and national rules, while, at the same time, removing unnecessary and outdated regulatory provisions, rather than adding more regulatory provisions with an aim of reflecting today’s technological advancements;
Amendment 4 #
Draft opinion Paragraph 1 1. Calls for steps to be taken to safeguard the availability of content for which editorial responsibility is taken or which is produced by journalists and all other media that are already subject to a generally recognised independent oversight on other platforms or in other services so
Amendment 5 #
Draft opinion Paragraph 1 1. Calls for steps to be taken to safeguard the availability of lawful content for which editorial responsibility is taken or which is produced by journalists and all other media that are already subject to a generally recognised independent oversight on other platforms or in other services so that their content is not subjected to any further controls;
Amendment 6 #
Draft opinion Paragraph 1 a (new) 1 a. Considers that due to constant and rapid technological progress and the associated development of new products and services, many of which cannot yet be anticipated, any new relevant legislative proposals should as much as possible also be forward looking instead of only concentrating on addressing the immediate challenges related to online platforms and current dominant market players;
Amendment 7 #
Draft opinion Paragraph 1 a (new) 1 a. Emphasises that the products bought from online marketplaces should comply with all the relevant Union safety regulations, as the Digital Services Act should be able to upgrade the liability and safety rules for digital platforms, services and products;
Amendment 8 #
Draft opinion Paragraph 1 a (new) 1 a. Emphasises that content that is legal and legally shared under Union or national law has to stay online and that any removal of such content must not lead to the identification of individual users nor to the processing of personal data;
Amendment 9 #
1 a. Recalls that transparency obligations on media platforms and services operating online should also apply to their ownership and their funding sources;
source: 650.540
2020/05/07
IMCO
89 amendments
Amendment 1 #
Draft opinion Recital B a (new) Ba. Whereas the e-commerce Directive 1a is the legal framework for online services in the internal market that regulates content management; whereas any fragmentation of that framework, which might result from the revision of the e-commerce Directive, should be avoided; __________________ 1a Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1)
Amendment 10 #
Draft opinion Paragraph 1 1. Welcomes the “CPC Common Position COVID-19” 3 issued by the Commission and the Consumer Protection Cooperation (CPC) authorities of the Member States on the most recent reported scams and unfair practices in relation to the COVID-19 outbreak; stresses the necessity to better protect consumers by providing reliable and transparent information on malpractices, such as misleading claims and scams for products in high demand; calls on all platforms to cooperate with the Commission and the competent authorities of the CPC network and members of the European Consumer Centres (ECC) to better identify illegal practices and take down scams, and asks the Commission to constantly review the common guidelines for the placement and/or sale of items and services of a false, misleading or otherwise abusive content for consumers; believes such guidelines should not only seek to apply Union and national consumer law, but to proactively seek to put in place the means to react to the crisis in the market rapidly; __________________ 3 European Commission / Consumer Protection Cooperation (CPC) Network, Common Position of CPC Authorities, “Stopping scams and tackling unfair
Amendment 11 #
Draft opinion Paragraph 1 1. Welcomes the “CPC Common Position COVID-19”3 issued by the Commission and the Consumer Protection Cooperation (CPC) authorities of the Member States on the most recent reported scams and unfair practices in relation to the COVID-19 outbreak; calls on all platforms to cooperate with the Commission and the competent authorities to better identify illegal practices, take down scams and asks the Commission to
Amendment 12 #
Draft opinion Paragraph 1 1.
Amendment 13 #
Draft opinion Paragraph 1 1. Welcomes the “CPC Common Position COVID-19”3 issued by the Commission and the Consumer Protection Cooperation (CPC) authorities of the Member States on the most recent reported scams and unfair practices in relation to the COVID-19 outbreak; calls on all platforms to cooperate with the Commission and the competent authorities to better identify illegal practices, take down scams and asks the Commission to
Amendment 14 #
Draft opinion Paragraph 1 1. Welcomes the “CPC Common Position COVID-19”3 issued by the Commission and the Consumer Protection Cooperation (CPC) authorities of the
Amendment 15 #
Draft opinion Paragraph 1 a (new) 1a. Stresses that in view of commercial activities on online market places, self- regulation has proven to be insufficient and, therefore, asks the Commission to introduce strong safeguards and obligations for product safety and consumer protection for commercial activities on online market places, accompanied by a tailored liability regime with proper enforcement mechanisms;
Amendment 16 #
Draft opinion Paragraph 1 a (new) 1a. Calls on the Commission to introduce a notice and action legislation, in order to make the removal of illegal content faster and more efficient; stresses that in order to uphold the right to effective remedy, notice and action procedures shall provide users with the right to appeal decisions honouring removal requests, but also decisions denying such requests;
Amendment 17 #
Draft opinion Paragraph 1 a (new) 1a. Welcomes the measures taken by platforms in relation to the COVID-19 which help to proactively take down misleading ads and ‘miracle products' with unsupported health claims, reinforcing automated and human monitoring of content;
Amendment 18 #
Draft opinion Paragraph 1 b (new) 1b. Recalls that in line with Directive (EU) 2018/1808 1a (AVMS Directive) ex-ante control measures or upload filtering of content do not comply with Article 15 of Directive 2000/31/EC; underlines therefore that the future Digital Services Act should prohibit imposing mandatory automated technologies to control content on hosting service providers or other intermediary services; __________________ 1a Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audio-visual media services (Audio-visual Media Services Directive) in view of changing market realities (OJ L 303, 28.11.2018, p. 69).
Amendment 19 #
Draft opinion Paragraph 1 b (new) 1b. Asks the Commission to analyse the effect of self-regulatory measures adopted by platforms on misleading ads;
Amendment 2 #
Draft opinion Recital C C. Whereas the revision of Directive 2005/29/EC 1a by Directive (EU) 2019/2161 1b, and Directives (EU) 2019/770 1 and (EU) 2019/771 2 on certain aspects concerning contracts for the supply of digital content and digital services and contracts for the sale of goods have only recently been adopted; __________________ 1 Directive (EU) 2019/770 of the European
Amendment 20 #
Draft opinion Paragraph 1 c (new) 1c. Asks the Commission to improve consumer rights in the future Digital Services Act, by introducing safeguards to prevent violations of fundamental rights of users, which are missing from Directive 2000/31/EC; notes that those should include, as a minimum, internal and external dispute mechanisms, and the clearly stated possibility of judicial redress;
Amendment 21 #
Draft opinion Paragraph 1 d (new) 1d. Welcomes efforts to bring transparency to content removal; in order to verify compliance with the rules, underlines that the requirement to publish periodic transparency reports should be mandatory, and should include the number of notices, type of entities notifying content, nature of the content subject of complaint, response time by the intermediary, the number of appeals;
Amendment 22 #
Draft opinion Paragraph 1 e (new) 1e. In order to verify such transparency reports and compliance with legal obligations, and in line with the Council of Europe Recommendation CM/Rec(2018)2, Member States should equally make available, publicly and in a regular manner, comprehensive information on the number, nature and legal basis of content restrictions or disclosures of personal data that they have addressed to intermediaries, including those based on international mutual legal assistance treaties, and on steps taken as a result of those requests;
Amendment 23 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency to advertising online and considers that further clarity and guidance is needed as regards professional diligence and obligations for platforms;
Amendment 24 #
Draft opinion Paragraph 2 2.
Amendment 25 #
Draft opinion Paragraph 2 2.
Amendment 26 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency to advertising online and considers that further clarity
Amendment 27 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency to advertising online and considers that further clarity and
Amendment 28 #
Draft opinion Paragraph 2 2.
Amendment 29 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency to advertising online and considers that further clarity and guidance is needed as regards professional diligence and obligations for platforms; believes that where advertisers and intermediaries are established in a third country, they should designate a
Amendment 3 #
Draft opinion Recital C a (new)
Amendment 30 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency and accountability to advertising online and considers that further clarity and guidance is needed as regards professional diligence and obligations for platforms; believes that where advertisers and intermediaries are established in a third country, they should designate a legal representative, established in the Union, who can be held accountable for the content of advertisements, in order to for example allow for consumer redress in the case of false or misleading advertisements;
Amendment 31 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency to advertising online and considers that further clarity and guidance is needed as regards professional diligence and obligations for platforms; believes that
Amendment 32 #
Draft opinion Paragraph 2 2. Welcomes efforts to bring transparency to advertising online and considers that further clarity and guidance is needed as regards professional diligence and obligations for platforms; believes that where
Amendment 33 #
Draft opinion Paragraph 2 a (new) 2a. Considers that the websites of platforms should provide a means for consumers to easily lodge complaints concerning false or misleading third-party advertising on these platforms;
Amendment 34 #
Draft opinion Paragraph 2 b (new) 2b. Is of the opinion that the list of legal representatives established in the EU that can be held responsible for the content of advertisements must be easily accessible on the platforms’ websites;
Amendment 35 #
Draft opinion Paragraph 3 3.
Amendment 36 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify and publish what sanctions or other restrictions those advertisement intermediaries and platforms
Amendment 37 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify what sanctions or other restrictions
Amendment 38 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify what sanctions or other restrictions those advertisement intermediaries and platforms should be subject to if they knowingly accept false or misleading advertisements; believes that online platforms should actively monitor the advertisements shown on their sites as well as fake reviews, in order to ensure they do not profit from false or misleading advertisements, including from influencer marketing content which is not being disclosed as sponsored; believes that platforms should also protect consumers from unsolicited commercial communications; underlines that advertisements for commercial products and services, and advertisements of a political or other nature are different in form and function and therefore should be subject to different guidelines and rules;
Amendment 39 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify what sanctions or other restrictions those advertisement intermediaries and platforms should be subject to if they knowingly accept false or misleading advertisements;
Amendment 4 #
Draft opinion Recital C a (new) Ca. Whereas Regulation (EU) 2017/2394 1a has a pivotal role in enhancing cooperation amongst national authorities in the field of consumer protection; __________________ 1a Regulation (EU) 2017/2394 of the European Parliament and of the Council of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws and repealing Regulation (EC) No 2006/2004 (OJ L 345, 27.12.2017, p. 1).
Amendment 40 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify what sanctions or other restrictions those advertisement intermediaries and platforms should be subject to if they knowingly accept false or misleading advertisements; believes that online platforms should
Amendment 41 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify what sanctions or other restrictions those advertisement intermediaries and platforms should be subject to if they knowingly accept false or misleading advertisements; believes that online platforms should actively monitor the advertisements shown on their sites, in order to ensure they do not profit from false or misleading advertisements, including from influencer marketing content which is not being disclosed as sponsored; underlines that advertisements for commercial products and services, and advertisements of a political or other nature are different in form and function and therefore should be subject to different
Amendment 42 #
Draft opinion Paragraph 3 3. Asks the Commission to clarify what sanctions or other restrictions those advertisement intermediaries and platforms should be subject to if they
Amendment 43 #
Draft opinion Paragraph 3 a (new) 3a. Underlines that video sharing platforms and social media have the capacity to amplify illegal content; calls on companies to make recommendation algorithms transparent, in order to give consumers and researchers insight into those processes, in particular on the data used, the purpose of the algorithm, personalisation, its outcomes and potential dangers, while respecting the principles of explicability, fairness and responsibility; stresses the need to guarantee and properly implement the right of users to opt in for recommended and personalised services;
Amendment 44 #
Draft opinion Paragraph 3 a (new) 3a. Stresses the need to strengthen the coherence between the existing obligations set out in the e-Commerce Directive and the Directive 2005/29/EC on Unfair Commercial Practices related to the transparency of commercial communications and digital advertising;
Amendment 45 #
Draft opinion Paragraph 3 a (new) 3a. Considers that the lack of transparency in the use of ‘chatbots’ is likely to cause difficulties for certain categories of particularly vulnerable people;
Amendment 46 #
Draft opinion Paragraph 3 b (new) 3b. Underlines the importance of algorithmic transparency for consumer protection, namely by ensuring explainability and auditability of automated decision-making in the context of both advertisement and content moderation;
Amendment 47 #
Draft opinion Paragraph 3 b (new) 3b. Urges the Commission to introduce a requirement for websites and social media accounts to clearly and unequivocally state whether the user is interacting with artificial intelligence algorithms simulating a human conversation;
Amendment 48 #
Draft opinion Paragraph 3 c (new)
Amendment 49 #
Draft opinion Paragraph 3 c (new) 3c. Urges the Commission to assess the requirement for influencers to communicate in a clear, intelligible and visible manner at the start of the post whether the influencer was paid, directly or indirectly, or received products free of charge or at a discount for that post;
Amendment 5 #
Draft opinion Recital C a (new) Ca. Whereas, in relation to the COVID-19 outbreak, the Commission welcomed the positive approach by the platforms after sending them the letters on 23 March 2020;
Amendment 50 #
Draft opinion Paragraph 3 d (new) 3d. Calls on the Commission to promote technologies which protect the privacy of users in advertising, with a particular focus on the most vulnerable groups;
Amendment 51 #
Draft opinion Paragraph 4 4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and to seek ways to
Amendment 52 #
Draft opinion Paragraph 4 4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and to seek ways to ensure compliance with Union law, in order to allow greater and easier engagement for consumers
Amendment 53 #
Draft opinion Paragraph 4 4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and to seek ways to allow greater and easier engagement for consumers, including in the choice of clauses in order to allow for better-informed consent; notes that EULAs are often accepted by users without reading them; moreover notes that when a EULA does allow for users to opt out of clauses, platforms may require users to do so at each use; notes that the majority of EULAs can be unilaterally changed by the platforms without any notice to consumers, with pernicious effects in terms of consumer protection, and calls for better consumer protection through effective measures;
Amendment 54 #
Draft opinion Paragraph 4 4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and to seek ways to allow greater and easier engagement for consumers, including in the choice of clauses; notes that
Amendment 55 #
Draft opinion Paragraph 4 4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and Terms and Conditions Agreements (T&Cs) and to seek ways to allow greater and easier engagement for consumers, including in the choice of clauses; notes that EULAs and T&Cs are often accepted by users without reading them;
Amendment 56 #
Draft opinion Paragraph 4 4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and to seek ways to allow
Amendment 57 #
Draft opinion Paragraph 4 a (new) 4a. Believes that a summary text of a T&C and EULA, written in plain and clear language, including the option to "opt out" easily from optional clauses, should be displayed at the start of any such agreement; believes that the Commission should establish a template for T&Cs and EULAs summaries;
Amendment 58 #
Draft opinion Paragraph 5 5. Underlines that EULAs should always make the sharing of all data with third parties optional unless vital to the functioning of the services, establishing a high level of data protection and security; recommends that any data access remedy should be imposed only to tackle market failures, be in compliance with the GDPR, give consumers the right to object to data sharing, and provide consumers with technical solutions to help them control and manage flows of their personal information and have means of redress; asks the Commission to ensure that consumers can still use a connected device for all its primary functions even if a consumer withdraws their consent to share non-
Amendment 59 #
Draft opinion Paragraph 5 5. Underlines that EULAs should always make the sharing of all data with third parties optional unless vital to the functioning of the services, establishing a high level of data protection and security; recommends that any data access remedy should be imposed only to tackle market failures, be in compliance with the GDPR, give consumers the right to object to data sharing and provide consumers with technical solutions to help them control and manage flows of their personal information and have means of redress; asks the Commission to ensure that consumers can still use a connected device for all its primary functions even if a consumer withdraws their consent to share non-
Amendment 6 #
Draft opinion Recital C b (new) Cb. Whereas some online service providers (food delivery, driving services, online shops, etc.) require significant physical work to provide their services, but many do not formally employ the persons carrying out this work, regardless of the dependency relationship the persons have with the online service provider, giving rise to potentially vulnerable working conditions;
Amendment 60 #
Draft opinion Paragraph 5 5. Underlines that EULAs should always make the sharing of a
Amendment 61 #
Draft opinion Paragraph 5 5. Underlines that EULAs should always make the sharing of all data with third parties optional
Amendment 62 #
Draft opinion Paragraph 5 5. Underlines that
Amendment 63 #
Draft opinion Paragraph 5 5. Underlines that EULAs should always make the sharing of all data with third parties optional unless vital to the functioning of the services; asks the Commission to ensure that, where reasonable, consumers can still use a connected device for all its primary functions even if a consumer withdraws their consent to share non-
Amendment 64 #
Draft opinion Paragraph 5 5. Underlines that EULAs and T&Cs should always make the sharing of all data with third parties optional unless vital to the functioning of the services; asks the Commission to ensure that consumers can still use a connected device for all its primary functions even if a consumer withdraws their consent to share non-
Amendment 65 #
Draft opinion Paragraph 5 5. Underlines that EULAs should always make the sharing of all data with third parties optional unless vital to the functioning of the services; asks the Commission to ensure that consumers can still use a connected device for all its primary functions even if a consumer withdraws their consent to share non-operational data with the device manufacturer or third parties; reiterates the need for transparency in EULAs regarding the possibility and scope of data sharing with third parties;
Amendment 66 #
Draft opinion Paragraph 5 a (new) 5a. Asks the Commission to include in a future Digital Services Act an obligation for interoperability for “digital gatekeepers”, in order to restore a level playing field for SMEs, thus enlarging consumer choices and therefore providing more diversity online;
Amendment 67 #
Draft opinion Paragraph 6 6. Underlines that the Directive (EU) 2019/770 and Directive (EU) 2019/771 are still to be properly transposed and implemented;
Amendment 68 #
Draft opinion Paragraph 6 6. Underlines that the Directive (EU) 2019/2161, Directive (EU) 2019/770 and Directive (EU) 2019/771 are still to be properly transposed and implemented; asks the Commission to take this into account before taking additional measures;
Amendment 69 #
Draft opinion Paragraph 6 6. Underlines that the Directive (EU) 2019/770 and Directive (EU) 2019/771 are still to be properly transposed and implemented; asks the Commission to take this into account
Amendment 7 #
Draft opinion Recital C b (new) Cb. Whereas Regulation (EU) 2016/679 1a (GDPR) establishes the rules on the processing of personal data and on the protection of personal data; __________________ 1a Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
Amendment 70 #
Draft opinion Paragraph 6 6. Underlines that the Directive (EU) 2019/770 and Directive (EU) 2019/771 are still to be properly transposed and implemented; asks the Commission to take this into account
Amendment 71 #
Draft opinion Paragraph 7 7. Notes the rise of “smart contracts” based on distributed ledger technologies; asks the Commission to analyse if certain aspects of “smart contracts” should be clarified, inter alia putting forward an unequivocal definition of “smart contract” and of “DLT”, and if guidance should be given in order to ensure legal certainty for
Amendment 72 #
Draft opinion Paragraph 7 7. Notes the rise of “smart contracts” based on distributed ledger technologies
Amendment 73 #
Draft opinion Paragraph 7 7. Notes the rise of “smart contracts”, such as those based on distributed ledger technologies; asks the Commission to analyse if certain aspects of “smart contracts” should be clarified and if guidance should be given in order to ensure legal certainty for businesses and consumers; asks especially for the Commission to work to ensure that such contracts
Amendment 74 #
Draft opinion Paragraph 7 7. Notes the rise of “smart contracts” based on distributed ledger technologies; asks the Commission to analyse
Amendment 75 #
Draft opinion Paragraph 7 7. Notes the rise of “smart contracts” based on distributed ledger technologies; asks the Commission to analyse if certain aspects of “smart contracts” should be clarified and if guidance should be given in order to ensure legal certainty for businesses and consumers; asks especially for the Commission to work to ensure that such contracts with consumers
Amendment 76 #
Draft opinion Paragraph 8
Amendment 77 #
Draft opinion Paragraph 8 8. Stresses that any future legislative proposals should seek to
Amendment 78 #
Draft opinion Paragraph 8 8. Stresses that any future legislative proposals should seek to remove current and prevent potentially new barriers in the supply of digital services by online platforms; underlines, at the same time, that new Union obligations on platforms must be proportional and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions; stresses in this regard the importance of establishing a well-balanced Union approach with the ultimate aim of ensuring responsible and non- discriminatory behaviour of online platforms, in line with the Union values and fundamental rights; underlines the need to prevent gold-plating
Amendment 79 #
Draft opinion Paragraph 8 8. Stresses that any future legislative proposals should seek to remove current and prevent potentially new unjustified barriers in the supply of digital services by online platforms, as well as foster an online environment free of illegal content that would be detrimental to Union consumers and the economy of the Union; underlines, at the same time, that new Union obligations on platforms must be proportional and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions; underlines the need to prevent gold-plating practices of Union legislation by Member States.
Amendment 8 #
Draft opinion Recital C b (new) Cb. Whereas only relevant data, statistics, analyses and proper enforcement could demonstrate a need for any further measures;
Amendment 80 #
Draft opinion Paragraph 8 8. Stresses that any future legislative proposals should seek to remove current and prevent potentially new barriers, lock- ins and reduced competition in the supply of digital services by online platforms; underlines, at the same time, that new Union obligations on platforms must have the public good at their core, be proportional and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions; underlines the need to prevent gold-plating practices of Union legislation by Member States.
Amendment 81 #
Draft opinion Paragraph 8 8. Stresses that any future legislative proposals should seek to remove current and prevent potentially new barriers in the supply of digital services by online platforms; underlines, at the same time, that new Union obligations on platforms must be proportional and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions and be guided by consumer protection and product safety goals; underlines the need to prevent gold-plating practices of Union legislation by Member States.
Amendment 82 #
Draft opinion Paragraph 8 8. Stresses that any future legislative proposals should be evidence-based and should seek to remove current and prevent potentially new barriers in the supply of digital services by online platforms; underlines, at the same time, that new Union obligations on platforms must be proportional and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions; underlines the need to prevent gold-plating practices of Union legislation by Member States.
Amendment 83 #
Draft opinion Paragraph 8 a (new) 8a. Stresses that any future legislative proposal adapting new commercial and civil rules for commercial entities operating online should combat anti-competitive uses of digital advertising and enable a level playing field; underlines that providing such a framework is essential to boost innovation and foster the growth of SMEs and start-ups in the Union, enabling them to profit from the Digital Single Market;
Amendment 84 #
Draft opinion Paragraph 8 a (new) 8a. Welcomes the Commission’s agreement with collaborative economy platforms allowing Eurostat to publish key data on tourism accommodation, as a first step; asks the Commission to introduce further information obligations for collaborative economy platforms in line with data protection rules, as this is essential for local authorities, in order to ensure the availability of affordable housing.
Amendment 85 #
Draft opinion Paragraph 8 a (new) 8a. Asks the Commission to undertake a revision of Regulation (EU) No 910/2014 1a (the eIDAS Regulation) in the light of development of virtual identification technologies, including the use of identification applications, in order to ensure that a Virtual "ID" can be used in the same way as a physical card when consumers buy or pay for products and services; __________________ 1a Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC (OJ L 257, 28.8.2014, p. 73).
Amendment 86 #
Draft opinion Paragraph 8 a (new) 8a. Considers that any Regulation setting up standards and procedures for content moderation should be applicable to any type of content moderation by online platforms (voluntary or at the request of competent authorities) and to content supervision or blocking by competent authorities.
Amendment 87 #
Draft opinion Paragraph 8 b (new) 8b. Considers that the Digital Services Act package should (i) specifically address the shortcomings of current legislation as regards the rights and social protection of persons working for online service providers, regardless of whether they are formally employed, and (ii) set out mechanisms to ensure compliance by the service providers.
Amendment 88 #
Draft opinion Paragraph 8 b (new) 8b. Asks the Commission to explore the possibility to present, as part of the Digital Services Act Package, several proposals, including on contractual rights in the context of supply of digital services, as referred to in recommendations set out in the Annex;
Amendment 89 #
Draft opinion Annex (new)
Amendment 9 #
Draft opinion Recital C c (new) Cc. Whereas Directive 2002/58/EC 1a ensures that all communications over public networks maintain respect for a high level of data protection and privacy; __________________ 1a Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).
source: 650.618
2020/06/05
JURI
489 amendments...
Amendment 1 #
Motion for a resolution Citation 3 a (new) - having regard to Directive 2013/11/EU of the European Parliament and of the Council of 21 May 2013 on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Directive on consumer ADR), Regulation (EU) No 524/2013 of the European Parliament and of the Council of 21 May 2013 on online dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Regulation on consumer ODR), and Directive 2008/52/EC of the European Parliament and of the Council of 21 May 2008 on certain aspects of mediation in civil and commercial matters,
Amendment 10 #
Motion for a resolution Citation 8 - having regard to Article 11 of the Charter of Fundamental Rights of the European Union and Article 10 of the
Amendment 100 #
Motion for a resolution Paragraph 3 3.
Amendment 101 #
Motion for a resolution Paragraph 3 3. Considers that, in order to guarantee independence and impartiality, any final decision on the legality of user- generated content must be made by an independent judiciary and not a private commercial entity;
Amendment 102 #
Motion for a resolution Paragraph 3 3. Considers that any final decision on the legality of user-generated
Amendment 103 #
Motion for a resolution Paragraph 3 3. Considers that
Amendment 104 #
Motion for a resolution Paragraph 3 3. Considers that any final decision on
Amendment 105 #
Motion for a resolution Paragraph 3 3. Considers that following the actions of digital service providers any final decision on the legality of user- generated content must be made by an independent judiciary
Amendment 106 #
Motion for a resolution Paragraph 3 3. Considers that
Amendment 107 #
Motion for a resolution Paragraph 3 a (new) 3a. Considers that ‘online marketplace’ content hosting platforms should be considered active hosts and must be legally responsible for their decisions on the legality of user-generated content;
Amendment 108 #
Motion for a resolution Paragraph 3 b (new) 3b. Considers that the notification and action system under the e-Commerce Directive – which obliges commercial platforms to remove identified illegal content after notification, including by right-holders, and an evaluation – must be strengthened by a notification and take-down system, so that illegal content already removed can no longer reappear on the platform;
Amendment 109 #
Motion for a resolution Paragraph 4 4. Insists that the regulation must proscribe content moderation practices that are discriminatory, including towards the most vulnerable, and must always respect the fundamental rights and freedoms of citizens, and in particular freedom of expression;
Amendment 11 #
Motion for a resolution Citation 8 a (new) - having regard to the 2007 Lugano Convention and the 1958 New York Convention,
Amendment 110 #
Motion for a resolution Paragraph 4 4. Insists that the regulation must proscribe content moderation practices that are dis
Amendment 111 #
Motion for a resolution Paragraph 4 4. Insists that the regulation must proscribe content moderation practices that are
Amendment 112 #
Motion for a resolution Paragraph 4 4. Insists that the regulation must pro
Amendment 113 #
Motion for a resolution Paragraph 4 4.
Amendment 114 #
Motion for a resolution Paragraph 4 a (new) 4a. Insists that the rules must also proscribe platforms’ practices that interfere with media freedom and pluralism, in particular by prohibiting platforms from exercising a second layer of control over content that is provided under a media service provider’s responsibility and is subject to specific standards and oversight;
Amendment 115 #
Motion for a resolution Paragraph 5
Amendment 116 #
Motion for a resolution Paragraph 5 5.
Amendment 117 #
Motion for a resolution Paragraph 5 5. Recommends the establishment of a European Agency tasked with developing common standards and creating new analysis and revision tools, monitoring and enforcing compliance with contractual rights as regards content management, auditing any algorithms used for automated content moderation and curation, and imposing penalties for non-compliance;
Amendment 118 #
Motion for a resolution Paragraph 5 5. Recommends the establishment of a
Amendment 119 #
Motion for a resolution Paragraph 5 5.
Amendment 12 #
Motion for a resolution Recital A A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that balances central concerns like respect for fundamental rights and other rights of citizens, with the need to support development and economic progress, taking into account the interests of users and all market participants, with particular regard to small businesses, SMEs and start-ups;
Amendment 120 #
Motion for a resolution Paragraph 5 5.
Amendment 121 #
Motion for a resolution Paragraph 5 5. Recommends the
Amendment 122 #
Motion for a resolution Paragraph 5 5. Recommends th
Amendment 123 #
Motion for a resolution Paragraph 5 a (new) 5a. Recalls that currently content moderation at European level is done on the basis of injunctions which have no legal force, and that the Commission only requires platforms to moderate the distribution of hate content or the removal of terrorist content; recalls that the power to moderate should be removed from the platforms themselves, and that, as part of the impact assessment, consideration should be given to the best way of entrusting this moderation to a fully independent external body;
Amendment 124 #
Motion for a resolution Paragraph 5 a (new) 5a. Calls for content hosting platforms to evaluate the risk that their content management policies for legal content pose to society (e.g. public health, disinformation) and, on the basis of reports presented to the relevant European Agency or European body, to hold a biannual dialogue with the relevant European Agency or European body and the relevant national authorities;
Amendment 125 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly
Amendment 126 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency, concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; calls for the publication of comprehensive transparency reports, based on a consistent methodology and assessed on the basis of relevant performance indicators; further suggests that content hosting platforms publish their reasoned decisions on removing user-generated content on a publicly accessible database;
Amendment 127 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency, concerning the compliance of their terms and conditions with the provisions of the
Amendment 128 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports to the
Amendment 129 #
Motion for a resolution Paragraph 6 6. Suggests that
Amendment 13 #
Motion for a resolution Recital A A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that balances central concerns
Amendment 130 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports to the European Agency, concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish th
Amendment 131 #
Motion for a resolution Paragraph 6 6. Suggests that
Amendment 132 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports to the European
Amendment 133 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports to the European
Amendment 134 #
Motion for a resolution Paragraph 6 6. Suggests that content hosting platforms regularly submit transparency reports
Amendment 135 #
Motion for a resolution Paragraph 7
Amendment 136 #
Motion for a resolution Paragraph 7 7. Recommends the establishment of independent dispute settlement bodies
Amendment 137 #
Motion for a resolution Paragraph 7 7. Recommends the appointment or establishment of independent
Amendment 138 #
Motion for a resolution Paragraph 7 7.
Amendment 139 #
Motion for a resolution Paragraph 8 8. Takes the firm position that
Amendment 14 #
Motion for a resolution Recital A A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens,
Amendment 140 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content
Amendment 141 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must
Amendment 142 #
Motion for a resolution Paragraph 8 8. Takes the
Amendment 143 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must
Amendment 144 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content, and considers that any such mechanism voluntarily employed by platforms must be subject to audits by the
Amendment 145 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of
Amendment 146 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act
Amendment 147 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must
Amendment 148 #
Motion for a resolution Paragraph 8 8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content, and considers that any such mechanism voluntarily employed by platforms must be subject to audits by the European
Amendment 149 #
Motion for a resolution Paragraph 8 8. Takes the
Amendment 15 #
Motion for a resolution Recital A A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that
Amendment 150 #
Motion for a resolution Paragraph 8 a (new) 8a. Stresses, therefore, that the platforms must be transparent in the processing of algorithms and of the data which train them, and must have effective means of moderation; notes that moderation depends on the models developed by certain international platforms whose economic model is based on maximum extraction of data for immediate reinjection into the advertising services market; considers it therefore in the interest of internet users to require the platforms to be transparent as regards the choice of the tools they prioritise for the processing of algorithms and the accompanying human actions;
Amendment 151 #
Motion for a resolution Paragraph 9 9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is one of the most detrimental practices in the digital society, especially in cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements; considers in this respect that new rules should, on top of bringing transparency and fairness, secure access to diverse and quality content in today’s digital environment and calls on the Commission to propose safeguards ensuring quality media content is easy to access and easy to find on third-party platforms.
Amendment 152 #
Motion for a resolution Paragraph 9 9. Considers that the user-targeted amplification of content based on
Amendment 153 #
Motion for a resolution Paragraph 9 9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is
Amendment 154 #
Motion for a resolution Paragraph 9 9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is
Amendment 155 #
Motion for a resolution Paragraph 9 a (new) 9a. Believes that the Commission must provide the requisite legal certainty as regards certain key definitions, such as the concepts of ‘systemic platform’ and ‘hosting platform’, so as to ensure a harmonised approach at EU level and expedite the removal of illegal content; considers furthermore that there is a need, in this connection, for a clear and precise definition of what is meant by ‘illegal content’;
Amendment 156 #
Motion for a resolution Paragraph 10
Amendment 157 #
Motion for a resolution Paragraph 10
Amendment 158 #
Motion for a resolution Paragraph 10 10. Is of the view that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require extensive tracking of user interaction with content and that behavioural advertising should depend on the users’ consent;
Amendment 159 #
Motion for a resolution Paragraph 10 10. Is of the view that the use of targeted advertising must be regulated
Amendment 16 #
Motion for a resolution Recital A A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that
Amendment 160 #
Motion for a resolution Paragraph 10 10. Is of the view that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require
Amendment 161 #
Motion for a resolution Paragraph 10 a (new) 10a. Notes however that targeted advertising is currently governed by the General Data Protection Regulation, which has to be properly enforced in the Union before any new legislation in this field is considered;
Amendment 162 #
Motion for a resolution Paragraph 11 11. Recommends, therefore, that the Digital Services Act set clear boundaries as regards the terms for accumulation of data for the purpose of targeted advertising, especially when data are tracked on third-party websites, and a phase-out prohibition on personalised advertisements, starting with minors;
Amendment 163 #
Motion for a resolution Paragraph 11 11. Recommends, therefore, that the Digital Services Act
Amendment 164 #
Motion for a resolution Paragraph 11 11. Recommends, therefore, that the
Amendment 165 #
Motion for a resolution Paragraph 11 11. Recommends, therefore, that the Digital Services Act
Amendment 166 #
Motion for a resolution Paragraph 11 a (new) 11a. Stresses that in line with the principle of data minimisation established by the General Data Protection Regulation, the Digital Services Act shall require intermediaries to enable the anonymous use of their services and payment for them wherever it is technically possible, as anonymity effectively prevents unauthorised disclosure, identity theft and other forms of abuse of personal data collected online; only where existing legislation requires businesses to communicate their identity, providers of major marketplaces could be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld;
Amendment 167 #
Motion for a resolution Paragraph 11 a (new) 11a. Recommends, therefore, that the Digital Services Act includes legal provisions preventing systemic platforms from accessing competitively sensitive third-party vendor data in their capacity as a platform and then using that data in their capacity as a vendor to sell products or services in competition with those third parties;
Amendment 168 #
Motion for a resolution Paragraph 11 b (new) 11b. Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the general and indiscriminate collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy; confirms that users have a right not to be subject to pervasive tracking when using digital services; stresses that in the spirit of the jurisprudence on communications metadata, public authorities shall be given access to a user’s subscriber data and metadata only to investigate suspects of serious crime, with prior judicial authorisation;
Amendment 169 #
Motion for a resolution Paragraph 11 c (new) 11c. Recommends that providers which support a single sign-on service with a dominant market share should be required to also support at least one open and federated identity system based on a non-proprietary framework;
Amendment 17 #
Motion for a resolution Recital A a (new) Aa. whereas Directive (EU) 2018/1808 has recently updated many of the rules applicable to audiovisual media services, including video-sharing platforms, and must be implemented by Member States by 19 September 2020.
Amendment 170 #
Motion for a resolution Paragraph 12 Amendment 171 #
Motion for a resolution Paragraph 12 12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability and portability of data; points out, however, that compulsory data access should not reduce incentives for innovation by the data collection platform and should, where it is necessary, be followed by adequate and appropriate safeguards;
Amendment 172 #
Motion for a resolution Paragraph 12 12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability and portability of data; calls for the introduction of rules and procedures which facilitate the sharing, with the relevant supervisory authorities, of the data used by systemic and hosting platforms, and which include content moderation tools and the means to remove illegal content;
Amendment 173 #
Motion for a resolution Paragraph 12 12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate
Amendment 174 #
Motion for a resolution Paragraph 12 12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing and increase transparency with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability and portability of data;
Amendment 175 #
Motion for a resolution Paragraph 12 12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability, interconnectivity and portability of data;
Amendment 176 #
Motion for a resolution Paragraph 12 a (new) 12a. Calls on the Commission to lay down rules to ensure effective data interoperability in order to make content purchased on a platform accessible on any digital tool irrespective of the make;
Amendment 177 #
Motion for a resolution Paragraph 13 13.
Amendment 178 #
Motion for a resolution Paragraph 13 13. Calls for content hosting platforms to give users
Amendment 179 #
Motion for a resolution Paragraph 13 13. Calls for content hosting platforms to give users the choice of whether to consent to the use of targeted advertising based on the user’s prior interaction with content on the same content hosting platform or on third party websites; further calls on the platforms to create an advertising archive that is publicly accessible; further recommends that the platforms cooperate with fact checkers in order to indicate the misinformation present on a platform and possible further steps;
Amendment 18 #
Motion for a resolution Recital A b (new) Ab. whereas Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC, has established new rules for online content-sharing providers and must be implemented by Member States by 7 June 2021.
Amendment 180 #
Motion for a resolution Paragraph 13 13. Calls for content hosting platforms to give users
Amendment 181 #
Motion for a resolution Paragraph 13 13. Calls for content hosting platforms to
Amendment 182 #
Motion for a resolution Paragraph 14 Amendment 183 #
Motion for a resolution Paragraph 14 14. Further calls for users to be guaranteed an appropriate degree of influence over the criteria according to which content is curated and made visible for them; affirms that this should also include the option to opt out from any
Amendment 184 #
Motion for a resolution Paragraph 14 14. Further calls for users to be guaranteed an appropriate degree of influence over the criteria according to which content is curated and made visible for them;
Amendment 185 #
Motion for a resolution Paragraph 14 14. Further calls for users to be guaranteed an appropriate degree of influence over the criteria according to which content is curated and made visible for them, in line with the principle of transparency; affirms that this should also include the option to opt out from any content curation;
Amendment 186 #
Motion for a resolution Paragraph 14 a (new) 14a. Encourages diversity of opinions and beliefs on digital platforms, but considers that freedom of expression does not justify the publication of all content and that measures must be taken to ensure a balance between freedom of expression and the rights of other users; considers that the new legislation should encourage the reporting of abuse by other users;
Amendment 187 #
Motion for a resolution Paragraph 14 a (new) 14a. Underlines the importance for the Digital Services Act to prove legally sound and effective from the point of view of the protection of children in the online environment, whilst ensuring full coordination and avoiding duplication with the General Data Protection Regulation and the Audiovisual Media Services Directive.
Amendment 188 #
Motion for a resolution Paragraph 15 15. Suggests that content hosting platforms publish all sponsor
Amendment 189 #
Motion for a resolution Paragraph 15 15.
Amendment 19 #
Motion for a resolution Recital B B. whereas
Amendment 190 #
Motion for a resolution Paragraph 15 a (new) 15a. Calls on the European Commission to require hosting platforms to verify the identity of those advertisers with whom they have a commercial relationship, so that the information they provide is accurate, ensuring accountability of advertisers in cases of promotion of illegal content;
Amendment 191 #
Motion for a resolution Paragraph 15 a (new) 15a. Calls on the Commission to request that hosting platforms verify the identity and veracity of the information entered by advertisers, and ensure that this is updated constantly and accurately;
Amendment 192 #
Motion for a resolution Paragraph 15 a (new) 15a. Suggests to create a common understanding on what constitutes false or misleading advertisement;
Amendment 193 #
Motion for a resolution Paragraph 15 b (new) 15b. Calls on the Commission to request that hosting platforms close the accounts or terminate all commercial contracts concluded with advertisers in cases where the promotion of illegal content has been detected, and that they take all the requisite steps to prevent that content reappearing on their platforms;
Amendment 194 #
Motion for a resolution Paragraph 16 Amendment 195 #
Motion for a resolution Paragraph 16 16. Regrets the existing information asymmetry between content hosting platforms and public authorities and calls for a compulsory and streamlined exchange of necessary information;
Amendment 196 #
Motion for a resolution Paragraph 16 16.
Amendment 197 #
Motion for a resolution Paragraph 16 a (new) 16a. Recommends that the Digital Services Act require platforms with significant market power to provide an application programming interface, through which third-party platforms and their users can interoperate with the main functionalities and users of the platform providing the application programming interface, including third-party services designed to enhance and customise the user experience of the platform providing the application programming interface, especially through services that customise privacy settings as well as content curation preferences;
Amendment 198 #
Motion for a resolution Paragraph 16 a (new) 16a. Calls on the Member States to ensure that online service providers comply with the requirements laid down in Article 5 of Directive 2000/31/EC on electronic commerce;
Amendment 199 #
Motion for a resolution Paragraph 16 b (new) 16b. Strongly underlines, on the other hand, that platforms with significant market power providing an application programming interface may not share, retain, monetise or use any of the data they receive from third-party services;
Amendment 2 #
Motion for a resolution Citation 3 a (new) - having regard to Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC,
Amendment 20 #
Motion for a resolution Recital B B. whereas a number of key civil and commercial law aspects might not have
Amendment 200 #
Motion for a resolution Paragraph 16 c (new) 16c. Stresses that interoperability obligations described above may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue suspension of the provisions on interoperability;
Amendment 201 #
Motion for a resolution Paragraph 16 d (new) 16d. Recalls that the provisions on interoperability described above must respect all relevant data protection laws; recommends, in this respect, that platforms be required by the Digital Services Act to ensure the technical feasibility of the data portability provisions laid down in Art. 20(2) of the General Data Protection Regulation;
Amendment 202 #
Motion for a resolution Paragraph 16 e (new) 16e. Suggests that content hosting platforms with significant market power providing an application programming interface publicly document all interfaces they make available for the purpose of allowing for the interoperability of services;
Amendment 203 #
Motion for a resolution Subheading 3 Provisions regarding terms and conditions, smart contracts and blockchains
Amendment 204 #
Motion for a resolution Paragraph 17 a (new) 17a. Underlines that the fairness and compliance with fundamental rights standards of terms and conditions imposed by intermediaries on the users of their services shall be subject to judicial review. Terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, shall not be binding;
Amendment 205 #
Motion for a resolution Paragraph 17 17. Calls on the Commission to assess the development and use of distributed ledger technologies, including blockchain and, in particular, of so-called smart contracts, namely the questions of legality and enforcement of smart contracts in cross border situations, and
Amendment 206 #
Motion for a resolution Paragraph 18 18. Strongly recommends that smart contracts include mechanisms
Amendment 207 #
Motion for a resolution Paragraph 18 18.
Amendment 208 #
Motion for a resolution Paragraph 18 a (new) 18a. Stresses the need for blockchain technologies, and ‘smart contracts’ in particular, to be utilised in accordance with antitrust rules and requirements, especially those prohibiting cartel agreements or concerted practices;
Amendment 209 #
Motion for a resolution Paragraph 18 a (new) 18a. Calls on the Commission to require ‘online marketplace’ content hosting platforms to prohibit non-identifiable content publishers. They must be able to identify the natural or legal persons who publish on their platform;
Amendment 21 #
Motion for a resolution Recital B a (new) Ba. whereas the constant increase in the supply of products and services through the use of hosting platforms lends itself to situations that have the potential to mislead consumers as to the actual origin of goods and services; and whereas illegal activities connected with the use of digital services cause huge losses for the whole of Europe’s production industry, whose expertise and craft traditions should be better safeguarded by commercial entities operating online;
Amendment 210 #
Motion for a resolution Paragraph 18 b (new) 18b. Calls on the Commission to require ‘online marketplace’ content hosting platforms to close the accounts of users who repeatedly publish illegal content and to take the necessary steps to ensure that such illegal content does not reappear on their platform;
Amendment 211 #
Motion for a resolution Paragraph 18 c (new) 18c. Calls on the Commission to prohibit access to the EU market for ‘online marketplace’ content hosting platforms which:
- are unable to identify their users;
- do not take all necessary measures to take down illegal content;
- do not close the accounts of users who repeatedly publish illegal content;
Amendment 212 #
Motion for a resolution Subheading 5 Amendment 213 #
Motion for a resolution Paragraph 19 Amendment 214 #
Motion for a resolution Paragraph 19 19. Considers that non-negotiable terms and conditions sh
Amendment 215 #
Motion for a resolution Paragraph 19 19. Considers that
Amendment 216 #
Motion for a resolution Paragraph 20 Amendment 217 #
Motion for a resolution Paragraph 20 Amendment 218 #
Motion for a resolution Paragraph 21 Amendment 219 #
Motion for a resolution Paragraph 21 Amendment 22 #
Motion for a resolution Recital B a (new) Ba. whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the EU, leading to significant fragmentation on the market and consequently legal uncertainty for European users and services operating cross-border, combined with a lack of regulatory control over key aspects of today's information environment;
Amendment 220 #
Motion for a resolution Paragraph 21 a (new) 21a. Stresses that service providers shall not be required to remove or disable access to information that is legal in their country of origin;
Amendment 222 #
Motion for a resolution Paragraph 21 b (new) 21b. Highlights that, in order to constructively supplement the rules of the e-Commerce Directive and to ensure legal certainty, applicable legislation shall exhaustively and explicitly spell out the duties of digital service providers rather than imposing a general duty of care; highlights that the legal regime for digital providers’ liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers;
Amendment 223 #
Motion for a resolution Paragraph 21 d (new) 21d. Stresses that the responsibility for enforcing the law, deciding on the legality of online activities and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent judicial authorities; only a hosting service provider that has actual knowledge of illegal content and is aware beyond doubt of its illegal nature shall be subject to content removal obligations;
Amendment 224 #
Motion for a resolution Paragraph 21 e (new) 21e. Underlines that illegal content should be removed where it is hosted, and that access providers shall not be required to block access to content;
Amendment 225 #
Motion for a resolution Paragraph 21 f (new) 21f. Stresses that proportionate sanctions should be applied to violations of the law, which shall not encompass excluding individuals from digital services;
Amendment 226 #
Motion for a resolution Paragraph 22 a (new) 22a. Provisions on the safety of products sold online
Amendment 227 #
Motion for a resolution Paragraph 22 b (new) 22b. Stresses that products bought through online marketplaces should comply with all the relevant EU safety regulations, given that the Digital Services Act should be able to upgrade the liability and safety rules for digital platforms, services and products;
Amendment 228 #
Motion for a resolution Paragraph 22 c (new) 22c. Strongly believes that there is a need to strengthen platform liability for illegal and unsafe products, thus reinforcing the digital single market; recalls that in such cases platform liability should be fit for purpose, taking into account the consumer safeguards in place, which should be complied with at all times, and the establishment of concomitant redress measures for retailers and consumers; believes that the system can only function if enforcement authorities have sufficient powers, tools and resources to enforce the provisions and cooperate effectively in cases with a transnational element;
Amendment 229 #
Motion for a resolution Paragraph 22 d (new) 22d. Stresses that in view of the commercial activities in online marketplaces, self-regulation has proven to be insufficient and calls, therefore, on the Commission to introduce strong safeguards and obligations with respect to product safety and consumer protection for commercial activities in online marketplaces, accompanied by a tailored liability regime with appropriate enforcement mechanisms;
Amendment 23 #
Motion for a resolution Recital C C. whereas some businesses offering digital services could enjoy, due to strong data-
Amendment 230 #
Motion for a resolution Annex I – part A – introductory part – indent 1 a (new) - The proposal focuses on content moderation and curation, and civil and commercial law rules with respect to digital services. Other aspects, such as the regulation of online marketplaces, are not addressed, but should be included in the Digital Services Act Regulation to be proposed by the European Commission.
Amendment 232 #
Motion for a resolution Annex I – part A – introductory part – indent 4 - The proposal aims to further address the inadmissible and unfair terms and conditions used for the purpose of digital services.
Amendment 233 #
Motion for a resolution Annex I – part A – introductory part – indent 5 - The proposal raises the question regarding aspects of data collection in contravention of fair contractual rights of users, data protection and online confidentiality rules.
Amendment 234 #
Motion for a resolution Annex I – part A – introductory part – indent 6 Amendment 235 #
Motion for a resolution Annex I – part A – introductory part – indent 7 - The proposal
Amendment 236 #
Motion for a resolution Annex I – part A – introductory part – indent 7 a (new) - The proposal seeks to strike a balance between, on the one hand, the protection of users’ fundamental and civil rights and, on the other, the provision of business incentives in this sector, especially for SMEs and start-ups;
Amendment 237 #
Motion for a resolution Annex I – part A – introductory part – indent 8 - The proposal raises the importance of
Amendment 238 #
Motion for a resolution Annex I – part A – part I – introductory part The
Amendment 239 #
Motion for a resolution Annex I – part A – part I – section 1 –introductory part A regulation
Amendment 24 #
Motion for a resolution Recital C C. whereas some businesses offering digital services enjoy, due to strong data- driven network effects, market dominance that enables them to impose their business practices on users and makes it increasingly difficult for other players to compete;
Amendment 240 #
Motion for a resolution Annex I – part A – part I – section 1 –introductory part A regulation ‘on contractual rights
Amendment 241 #
Motion for a resolution Annex I – part A – part I – section 1 –– indent 1 a (new) - It should build upon the home state control principle, by updating its scope in light of the increasing convergence of user protection.
Amendment 242 #
Motion for a resolution Annex I – part A – part I – section 1 –– indent 1 b (new) - It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options.
Amendment 243 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 1 c (new) - It should avoid extending its scope in a way that would conflict with existing sectorial rules already in force, such as the Copyright Directive or other existing European law in the media and audio-visual field.
Amendment 244 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 2 - It should provide principles for content moderation
Amendment 245 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 2 - It should provide proportionate, evidence-based principles for content moderation, including as regards discriminatory content moderation practices.
Amendment 246 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 2 a (new) - It should provide a dialogue between major content hosting platforms and the relevant, existing or new, European Agency or European body together with national authorities on the risk management of content management of legal content.
Amendment 247 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 2 a (new) - The involvement of the scientific community should be enhanced so that the interests of the European academic community are taken into account when drafting a legislative act.
Amendment 248 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 3 - It should provide formal and procedural standards for a notice and action system
Amendment 249 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 3 - It should provide formal and procedural standards for a deterrent and a notice
Amendment 25 #
Motion for a resolution Recital C C. whereas some businesses offering digital services enjoy, due to strong data- driven network effects, market dominance that makes it increasingly difficult for other players to compete and difficult for new businesses to even enter the market;
Amendment 250 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 3 - It should provide
Amendment 251 #
Motion for a resolution Annex I – part A – part I – section 1 – indent 3 - It should provide formal and procedural standards for a notice and
Amendment 252 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 3 a (new) - It should assess the use of digital technology instruments for the deterrence of illegal content online
Amendment 253 #
Motion for a resolution Annex I – part A – part I – section 1 – indent 4 - It should provide for an independent dispute settlement mechanism
Amendment 254 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 4 - It should provide rules for an independent dispute settlement mechanism by respecting the national competences of the Member States.
Amendment 255 #
Motion for a resolution Annex I – part A – part I – section 1 –– indent 5 - It should fully respect
Amendment 256 #
Motion for a resolution Annex I – part A – part I – section 1 – indent 5 - It should fully respect Union rules protecting personal data as well as fundamental rights and all applicable legislation.
Amendment 257 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 5 - It should fully respect Union rules
Amendment 258 #
Motion for a resolution Annex I – part A – part I – section 1 –indent 5 a (new) - It should provide rules regarding the responsibility of content hosting platforms for goods sold or advertised on them, taking into account supporting activities for SMEs in order to minimise their burden when adapting to this responsibility.
Amendment 259 #
Motion for a resolution Annex I – part A – part I – section 2 Amendment 26 #
Motion for a resolution Recital C C. whereas some businesses offering digital services enjoy, due to strong data- driven network effects, market dominance that makes it increasingly difficult for other players, including start-ups, to compete;
Amendment 260 #
Motion for a resolution Annex I – part A – part I – section 2 – introductory part A
Amendment 261 #
Motion for a resolution Annex I – part A – part I – section 2 – introductory part Amendment 262 #
Motion for a resolution Annex I – part A – part I – section 2 – introductory part A
Amendment 263 #
Motion for a resolution Annex I – part A – part I – section 2 – introductory part Amendment 264 #
Motion for a resolution Annex I – part A – part I – section 2 – introductory part A
Amendment 265 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 1 Amendment 266 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 1 Amendment 267 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 1 - regular
Amendment 268 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 1 - regular auditing of the algorithms employed by content hosting platforms for the purpose of content
Amendment 269 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 1 a (new) - regular monitoring of the practice of automated content filtering and curation, and reporting to the EU institutions;
Amendment 27 #
Motion for a resolution Recital C C. whereas some businesses offering digital services could enjoy, due to strong data-
Amendment 270 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 2 Amendment 271 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 2 Amendment 272 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 2 - regular review of the
Amendment 273 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 3 Amendment 274 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 3 - working with content hosting platforms on best practices to meet the transparency and accountability requirements for terms and conditions, as well as best practices in content moderation and implementing notice-and-
Amendment 275 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 3 a (new) - cooperate and coordinate with the national authorities of Member States related to the implementation of the Digital Services Act.
Amendment 276 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 Amendment 277 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 Amendment 278 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – introductory part -
Amendment 279 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – introductory part -
Amendment 28 #
Motion for a resolution Recital C C. whereas some businesses offering digital services enjoy, due to strong data- driven network effects,
Amendment 280 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – introductory part - imposing fines for non-compliance with the Digital Services Act. Fines should be set at up to 4 % of the total worldwide
Amendment 281 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 1 Amendment 282 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 1 Amendment 283 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 1 Amendment 284 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 1 - failure to implement the notice- and-
Amendment 285 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 1 - failure to implement the notice-and- action system as provided for in the Regulation;
Amendment 286 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 1 a (new) - failure to implement any other obligations with regard to content moderation;
Amendment 287 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 2 Amendment 288 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 2 Amendment 289 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 2 Amendment 29 #
Motion for a resolution Recital D Amendment 290 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 2 - failure to provide transparent, accessible, fair and non-discriminatory terms and conditions;
Amendment 291 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 2 - failure to provide fair, transparent, accessible and non-discriminatory terms and conditions;
Amendment 292 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 Amendment 293 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 Amendment 294 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 Amendment 295 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 - failure to provide access for the
Amendment 296 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 - failure to provide access for the European
Amendment 297 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 - failure to provide access for the European
Amendment 298 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 3 - failure to provide access for the European Agency to content
Amendment 299 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 4 Amendment 3 #
Motion for a resolution Citation 5 a (new) - having regard to the commitment of the European Commission President, Ms. Ursula von der Leyen, to upgrade the liability and safety rules for digital platforms, services and products, and complete the Digital Single Market via a Digital Services Act,
Amendment 30 #
Motion for a resolution Recital D D. whereas ex-post competition law enforcement alone cannot effectively address the impact of the market dominance of certain online platforms on fair competition in the digital single market; whereas competition law applied to the digital economy sector needs to be redefined in order to equip the sector with effective means to take into account the market power of digital actors;
Amendment 300 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 4 Amendment 301 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 4 Amendment 302 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 4 - failure to submit transparency reports to the
Amendment 303 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 4 - failure to submit transparency reports to the European
Amendment 304 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 4 - failure to submit transparency reports to the European
Amendment 305 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 5 Amendment 306 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 5 Amendment 307 #
Motion for a resolution Annex I – part A – part I – section 2 – indent 4 – subi. 5 a (new) - publication of studies with proposals for measures aimed at helping increase the competitiveness and growth of micro, small and medium-sized enterprises.
Amendment 308 #
Motion for a resolution Annex I – part A – part I – section 3 – introductory part The Digital Services Act should contain provisions requiring content hosting platforms to regularly publish and provide transparency reports to the
Amendment 309 #
Motion for a resolution Annex I – part A – part I – section 3 – introductory part The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the
Amendment 31 #
Motion for a resolution Recital D D. whereas
Amendment 310 #
Motion for a resolution Annex I – part A – part I – section 3 – introductory part The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the
Amendment 311 #
Motion for a resolution Annex I – part A – part I – section 3 – introductory part The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the
Amendment 312 #
Motion for a resolution Annex I – part A – part I – section 3 – introductory part The Digital Services Act should contain
Amendment 313 #
Motion for a resolution Annex I – part A – part I – section 3 – indent 1 – subi. 1 - the total number of notices received and the action taken accordingly,
Amendment 314 #
Motion for a resolution Annex I – part A – part I – section 3 – indent 1 – subi. 1 - the total number of notices received and for which types of content,
Amendment 315 #
Motion for a resolution Annex I – part A – part I – section 3 – indent 1 – subi. 3 - the total number of removal requests complied with
Amendment 316 #
Motion for a resolution Annex I – part A – part I – section 3 – indent 1 – subi. 8 - information on the enforcement of terms and conditions and information on court decisions ordering the annulment and/or modification of terms and conditions of use considered unfair or illegal by an EU country.
Amendment 317 #
Motion for a resolution Annex I – part A – part I – section 3 – indent 1 – subi. 8 - information on the enforcement of terms and conditions and information on the court rulings received to remove and/or delete terms and conditions considered illegal, per Member State.
Amendment 318 #
Motion for a resolution Annex I – part A – part I – section 4 – subparagraph 1 Content hosting platforms should, in addition, publish their decisions on content removal on a publicly accessible database to increase transparency for users.
Amendment 319 #
Motion for a resolution Annex I – part A – part I – section 4 – subparagraph 1 a (new) The Digital Services Act should indicate a set of clear indicators to define thresholds for content hosting platforms to be exempted from certain provisions mentioned in this chapter. Such indicators could include characteristics of the content hosting platform such as the size of its network (number of users), its financial strength, its access to data, its active role in content curation, its vertical integration and the presence of lock-in effects.
Amendment 32 #
Motion for a resolution Recital D D. whereas ex-post competition law enforcement alone cannot effectively address the impact of the market
Amendment 320 #
Motion for a resolution Annex I – part A – part II – section 1 – introductory part Measures regarding content curation, data and online advertisements, including political advertising to achieve politically motivated goals, in breach of fair contractual rights of users should include:
Amendment 321 #
Motion for a resolution Annex I – part A – part II – section 1 – introductory part Measures regarding content curation, data and online advertisements
Amendment 322 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 1 - Measures to limit the data collected by content hosting platforms, based on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements, while in the case of political advertising, the measures should be limited to the requirement of transparency in terms of clearly identifying political advertising, the possibility of identifying its sponsor and the entity in whose favour the advertisement was commissioned, and the obligation to indicate that it is political advertising should fall to its sponsor and be backed up by appropriate enforcement tools.
Amendment 323 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 1 - Measures to limit the data collected by content hosting platforms, based on interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements and making the collection of personal data subject to user consent.
Amendment 324 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 1 - Measures to
Amendment 325 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 1 -
Amendment 326 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 1 a (new) - In the context of political advertising, it would be appropriate to address the phenomenon of troll farms (also known as troll factories or web brigades) of anonymous commentators on political events who appear on social networks under a massive number of fake user profiles to manipulate public opinion, and to explore various options to combat this frequently cross-border interference in political competition, for instance promoting the concept of trusted personal profiles and possible synergies with efforts to build a European blockchain-based electronic identity verification service.
Amendment 327 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 2 - Users of content hosting platforms should be informed they are the object of targeted advertising, given access to their profile built by content hosting platforms and the possibility to modify it, and given the choice to opt in or out of receiving targeted advertisements.
Amendment 328 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 2 - Users of content hosting platforms should be given the choice to opt in or out
Amendment 329 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 2 - Users of content hosting platforms
Amendment 33 #
Motion for a resolution Recital D D. whereas ex-post competition law enforcement alone cannot effectively address the impact of the market dominance of certain online platforms
Amendment 330 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 3 – introductory part - Content hosting platforms should make available an archive of sponsor
Amendment 331 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 3 – subi. 1 - whether the advertisement or sponsorship is currently active or inactive,
Amendment 332 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 3 – subi. 2 - the timespan during which the advertisement or sponsorship was active,
Amendment 333 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 3 – subi. 3 - the name and contact details of the
Amendment 334 #
Motion for a resolution Annex I – part A – part II – section 1 – indent 3 – subi. 6
Amendment 335 #
Motion for a resolution Annex I – part A – part II – section 2
Amendment 336 #
Motion for a resolution Annex I – part A – part II – section 2 – introductory part The path to fair implementation of the rights of users as regards inter
Amendment 337 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1
Amendment 338 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 - an assessment of the possibility of defining fair contractual conditions to facilitate non-personal data sharing with the aim of addressing imbalances in market power
Amendment 339 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 - an assessment of the possibility of defining fair contractual conditions to facilitate and promote data sharing with the aim of addressing imbalances in market power, in particular through the interoperability and portability of data.
Amendment 34 #
Motion for a resolution Recital D a (new) Da. whereas the platform architecture, with its terms and conditions defined by rights and obligations, has become an intrinsic factor in terms of competition, being linked to the quality of service offered not only to business users of online intermediation services, for example, but also to end consumers;
Amendment 340 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 a (new) - a requirement for platforms with significant market power to provide an application programming interface, through which third-party platforms and their users can interoperate with the main functionalities and users of the platform providing the application programming interface, including third-party services designed to enhance and customise the user experience of the platform providing the application programming interface, especially through services that customise privacy settings as well as content curation preferences;
Amendment 341 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 a (new) - the ban on imposing a locked proprietary ecosystem for the use of digital products. In order to allow genuine interoperability of data, digital products must be in an open format so as to allow users to export to different digital environments.
Amendment 342 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 a (new) - users of dominant social media services and messaging services shall be given a right to cross-platform interaction via open interfaces (interconnectivity); users shall be able to interact with users of alternative services, and users of alternative services shall be allowed to interact with them.
Amendment 343 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 b (new) - provisions ensuring that platforms with significant market power providing an application programming interface may not share, retain, monetise or use any of the data they receive from third-party services;
Amendment 344 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 c (new) - provisions ensuring that the interoperability obligations described above may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue suspension of the provisions on interoperability;
Amendment 345 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 d (new) - provisions ensuring that platforms be required by the Digital Services Act to ensure the technical feasibility of the data portability provisions laid down in Art. 20(2) of the General Data Protection Regulation;
Amendment 346 #
Motion for a resolution Annex I – part A – part II – section 2 – indent 1 e (new) - provisions ensuring that content hosting platforms with significant market power providing an application programming interface publicly document all interfaces they make available for the purpose of allowing for the interoperability of services;
Amendment 347 #
Motion for a resolution Annex I – part A – part II – section 3 – indent 1 - measures ensuring that the proper legislative framework is in place for the development and deployment of digital services
Amendment 348 #
Motion for a resolution Annex I – part A – part II – section 3 – indent 2
Amendment 349 #
Motion for a resolution Annex I – part A – part II – section 3 – indent 2 - measures ensuring that smart contracts are fitted with mechanisms that can halt their execution, in particular given private concerns of the weaker party or public concerns related to cartelisation and in respect for the rights of creditors in insolvency and restructuring.
Amendment 35 #
Motion for a resolution Recital E E. whereas content hosting platforms evolved from involving the mere display of content into sophisticated bodies and market players, in particular in the case of social networks that harvest and exploit usage data; whereas users have reasonable grounds to expect fair terms for the usage of such platforms; whereas users, whether private individuals or legal persons, have objective reasons to require fair terms with respect to access, transparency, pricing and conflict resolution;
Amendment 350 #
Motion for a resolution Annex I – part A – part II – section 3 – indent 2 a (new) - measures to ensure equality between the parties in the case of smart contracts, taking into account in particular the interests of small businesses and SMEs, for which the Commission should examine possible modalities.
Amendment 351 #
Motion for a resolution Annex I – part A – part II – section 4
Amendment 352 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 1
Amendment 353 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 1 - include the effective enforcement of existing measures ensuring that non-negotiable terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
Amendment 354 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 1 - include the effective enforcement of existing measures ensuring that non-negotiable terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
Amendment 355 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 1 - include measures ensuring that
Amendment 356 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 2
Amendment 357 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 2 - include measures clarifying private international law rules
Amendment 358 #
Motion for a resolution Annex I – part A – part II – section 4 – indent 3
Amendment 359 #
Motion for a resolution Annex I – part A – part II – section 4 – final part
Amendment 36 #
Motion for a resolution Recital E E. whereas content hosting platforms evolved from involving the mere display of content into sophisticated bodies and market players, in particular in the case of social networks that harvest and exploit usage data; whereas users have reasonable grounds to expect fair terms for the usage of such platform
Amendment 360 #
Motion for a resolution Annex I – part B – recital 1 (1) The terms and conditions that
Amendment 361 #
Motion for a resolution Annex I – part B – recital 2 (2) The civil law regimes governing the practices of content hosting platforms as regards content moderation are based on certain sector-specific provisions at Union level as well as on laws passed by Member States at national level, and there are notable differences in the obligations imposed by those civil law regimes on content hosting platforms and in their enforcement mechanisms. This has resulted in a fragmented regulatory framework at Union level impeding the development of European businesses in the digital single market.
Amendment 362 #
Motion for a resolution Annex I – part B – recital 4 (4) Given the detrimental effects of the fragmentation of the digital Single Market, the international character of content hosting, the great amount of illegal or harmful uploaded content and the dominant position of a few content hosting platforms located outside the Union, the various issues that arise in respect of content hosting need to be regulated in a manner that entails full harmonisation and therefore by means of a regulation;
Amendment 363 #
Motion for a resolution Annex I – part B – recital 4 (4) Given the detrimental effects of the fragmentation of the digital Single Market, legal uncertainty for consumers, the international character of content hosting, and the dominant position of a few content hosting platforms located outside the Union, the various issues that arise in respect of content hosting need to be regulated in a manner that entails
Amendment 364 #
Motion for a resolution Annex I – part B – recital 4 (4) Given the detrimental effects of the fragmentation of the digital Single Market, the international character of content hosting and the
Amendment 365 #
Motion for a resolution Annex I – part B – recital 5 (5) Concerning relations with users, this Regulation should lay down minimum standards for the transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should be clear, accessible, intelligible and unambiguous and include transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress.
Amendment 366 #
Motion for a resolution Annex I – part B – recital 5 (5) Concerning relations with users, this Regulation should lay down minimum standards for
Amendment 367 #
Motion for a resolution Annex I – part B – recital 5 (5) Concerning relations with users, this Regulation should lay down minimum standards for the transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should include transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to
Amendment 368 #
Motion for a resolution Annex I – part B – recital 5 (5) Concerning relations with users, this Regulation should lay down minimum standards for the transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should include transparent
Amendment 369 #
Motion for a resolution Annex I – part B – recital 6
Amendment 37 #
Motion for a resolution Recital E E. whereas content hosting platforms evolved from involving the mere display of content into sophisticated bodies and market players, in particular in the case of social networks that optimise content which harvests and exploits usage data; whereas users have
Amendment 370 #
Motion for a resolution Annex I – part B – recital 6 (6) User-targeted amplification of content based on
Amendment 371 #
Motion for a resolution Annex I – part B – recital 6 a (new) (6a) In order to ensure evaluation of the risks presented by content amplification, this Regulation establishes a biannual dialogue on content management policies for legal content between major content hosting platforms and the relevant European agency or body, whether existing or new, together with relevant national authorities.
Amendment 372 #
Motion for a resolution Annex I – part B – recital 6 a (new) (6a) Recalls that algorithms that decide on the ranking of search results influence individual and social communications and interactions and can be opinion-forming, especially in the case of media content.
Amendment 373 #
Motion for a resolution Annex I – part B – recital 7 (7) In order to ensure, inter alia, that users can assert their rights they should be given an appropriate degree of influence over the curation of content made visible to them, including the possibility to opt out of any content curation altogether. In particular, users should not be subject to curation without specific consent. Dominant platforms should provide users with an interface to have content curated by software or services of their choice.
Amendment 374 #
Motion for a resolution Annex I – part B – recital 7 (7) In order to ensure, inter alia, that users can assert their rights they should be given an appropriate degree of influence
Amendment 375 #
Motion for a resolution Annex I – part B – recital 7 (7) In order to ensure, inter alia, that users can assert their rights they should be given an appropriate degree of influence over the curation of content made visible to them
Amendment 376 #
Motion for a resolution Annex I – part B – recital 8 a (new) (8a) Far too many goods sold online do not follow safety standards. One way of ensuring that content hosting platforms perform due diligence checks of goods sold by or through them is to make the platforms jointly and severally responsible together with the primary seller. This would not be unreasonable for the content hosting platforms given that they take a share of the proceeds. Special attention should be paid to enabling small and medium-sized platforms to perform these checks, and any supporting activity such as standardisation should ensure that administrative burdens are kept to a minimum.
Amendment 377 #
Motion for a resolution Annex I – part B – recital 9
Amendment 378 #
Motion for a resolution Annex I – part B – recital 9 (9) This Regulation should not contain provisions forcing content hosting platforms to employ any form of
Amendment 379 #
Motion for a resolution Annex I – part B – recital 9 (9) This Regulation should
Amendment 38 #
Motion for a resolution Recital E a (new) Ea. whereas, in the context of transactions, the online marketplace contains grey areas, as some websites or online marketplaces are used to sell products in violation of the rules applicable in EU countries, and whereas it is therefore important that measures be taken against internet service providers to stop or prevent infringements of intellectual property rights and to ensure consumer safety;
Amendment 380 #
Motion for a resolution Annex I – part B – recital 9 (9) This Regulation should not contain provisions forcing passive content hosting platforms to employ any form of fully automated ex-ante control of content.
Amendment 381 #
Motion for a resolution Annex I – part B – recital 9 (9) Th
Amendment 382 #
Motion for a resolution Annex I – part B – recital 9 (9) This Regulation should
Amendment 383 #
Motion for a resolution Annex I – part B – recital 9 a (new) (9a) This Regulation does not prevent platforms from using an automated content mechanism where necessary and justified, and in particular promotes the use of such a mechanism where the illegal nature of the content has either been established by a court or can be easily determined without contextualisation.
Amendment 384 #
Motion for a resolution Annex I – part B – recital 10 (10) This Regulation should also include provisions against
Amendment 385 #
Motion for a resolution Annex I – part B – recital 10 (10) This Regulation should also include provisions against inadmissible discriminatory practices, exploitation or exclusion, for the purposes of content moderation
Amendment 386 #
Motion for a resolution Annex I – part B – recital 11 (11) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application.
Amendment 387 #
Motion for a resolution Annex I – part B – recital 11 (11) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application.
Amendment 388 #
Motion for a resolution Annex I – part B – recital 12
Amendment 389 #
Motion for a resolution Annex I – part B – recital 12 (12) After a notice has been issued, the uploader should be informed by the hosting platform about it, and in particular about the reason for the notice and for the action taken, be provided with information about the procedure, including about appeal and referral to independent dispute settlement bodies, and about available remedies in the event of false notices. Such information should, however, not be given if the content hosting platform has been informed by public authorities about ongoing law enforcement investigations. In such a case, it should be for the relevant authorities to inform the uploader about the issue of a notice, in accordance with applicable rules.
Amendment 39 #
Motion for a resolution Recital E a (new) Ea. whereas social media networks and the collaborative economy are blurring the lines between content and service providers and consumers, with supply patterns spreading out horizontally rather than remaining vertical and linear;
Amendment 390 #
Motion for a resolution Annex I – part B – recital 13 (13) All concerned parties should be informed about a decision as regards a notice. The information provided to concerned parties should also include, apart from the outcome of the decision, at least the reason for the decision
Amendment 391 #
Motion for a resolution Annex I – part B – recital 14 (14) Given the immediate nature of content hosting and the often ephemeral purpose of content uploading, it is necessary to establish independent dispute settlement bodies for the purpose of providing quick and efficient
Amendment 392 #
Motion for a resolution Annex I – part B – recital 15 (15) In order to ensure that users and notifiers make use of referral to independent dispute settlement bodies as a first step, it must be emphasised that such referral should not preclude any subsequent court action
Amendment 393 #
Motion for a resolution Annex I – part B – recital 15 (15)
Amendment 394 #
Motion for a resolution Annex I – part B – recital 15 (15) In order to ensure that users and notifiers
Amendment 395 #
Motion for a resolution Annex I – part B – recital 15 (15) In order to ensure that users and notifiers make use of referral to independent dispute settlement bodies as a first step, it must be emphasised that such referral should not preclude any subsequent court action. Given that content hosting platforms which enjoy
Amendment 396 #
Motion for a resolution Annex I – part B – recital 16 (16) Users should have the right to referral to a fair and independent dispute settlement body, as an alternative dispute settlement mechanism, to contest a decision taken by a content hosting platform following a notice concerning content they uploaded. Notifiers should have this right if they would have had legal standing in a civil procedure regarding the content in question.
Amendment 397 #
Motion for a resolution Annex I – part B – recital 17 (17) As regards jurisdiction, the competent independent dispute settlement body should be that located in the Member State in which the content forming the subject of the dispute has been uploaded. For natural persons, it should always be possible to bring complaints to the independent dispute settlement body of their own Member State.
Amendment 398 #
Motion for a resolution Annex I – part B – recital 17 (17)
Amendment 399 #
Motion for a resolution Annex I – part B – recital 18
Amendment 4 #
Motion for a resolution Citation 7 a (new) - having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 May 2016 on Online Platforms and the Digital Single Market - Opportunities and Challenges for Europe (COM(2016)288),
Amendment 40 #
Motion for a resolution Recital E a (new) Ea. whereas the transparency of digital services and content hosting platforms could contribute significantly to increasing the trust placed in them by the companies and users of these services.
Amendment 400 #
Motion for a resolution Annex I – part B – recital 19 (19) This Regulation should include obligations to report on its implementation and to review it within a reasonable time. For this purpose, the independent dispute settlement bodies established pursuant to this Regulation should submit reports on the number of referrals brought before them,
Amendment 401 #
Motion for a resolution Annex I – part B – recital 20 (20) Since the objective of this Regulation
Amendment 402 #
Motion for a resolution Annex I – part B – recital 21
Amendment 403 #
Motion for a resolution Annex I – part B – recital 21 (21) Action at Union level as set out in this Regulation would be substantially enhanced with the
Amendment 404 #
Motion for a resolution Annex I – part B – recital 21 (21) Action at Union level
Amendment 405 #
Motion for a resolution Annex I – part B – recital 21 (21)
Amendment 406 #
Motion for a resolution Annex I – part B – recital 21 (21) Action at Union level as set out in this Regulation would be substantially enhanced with the establishment
Amendment 407 #
Motion for a resolution Annex I – part B – recital 21 a (new) (21a) This Regulation must be based on a thorough impact study of the intended initiative, which will clearly demonstrate the need for new rules for information society service providers. This impact study will need to actively involve the industry stakeholders providing information society services that will be most affected by the potential initiative, and whose practical day-to-day experience will be of the greatest value in the design and evaluation of the impact study itself, and thus in the assessment of the need for the aforementioned initiative.
Amendment 408 #
Motion for a resolution Annex I – part B – recital 21 a (new) (21a) This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter as enshrined in the Treaties, in particular the freedom of expression and information, and the right to an effective remedy and to a fair trial.
Amendment 409 #
Motion for a resolution Annex I – part B – Article 1 – paragraph 1 The purpose of this Regulation is to contribute to the proper functioning of the internal market by laying down rules to
Amendment 41 #
Motion for a resolution Recital F F. whereas content hosting platforms
Amendment 410 #
Motion for a resolution Annex I – part B – Article 2 – paragraph 1 This Regulation applies to the management by content hosting platforms of content that is accessible to the public on websites or through smartphone applications in the Union, irrespective of the place of establishment or registration, or principal place of business of the content hosting platform. It shall not apply to non-commercial content hosting platforms and platforms with fewer than 100,000 users.
Amendment 411 #
Motion for a resolution Annex I – part B – Article 2 – paragraph 1 This Regulation applies to
Amendment 412 #
Motion for a resolution Annex I – part B – Article 2 – paragraph 1 This Regulation applies to the management by content hosting platforms of content that is accessible on websites or through smart
Amendment 413 #
Motion for a resolution Annex I – part B – Article 2 – paragraph 1 This Regulation applies to the management by content hosting platforms of content that is accessible on websites or through
Amendment 414 #
Motion for a resolution Annex I – part B – Article 3 –point 1 (1) ‘content hosting platform’ means an information society service within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council1 of which the main or one of the main purposes is to allow signed-up or non-signed-up users to upload content for display on a publicly accessible website or application; __________________ 1 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 415 #
Motion for a resolution Annex I – part B – Article 3 –point 1 (1) ‘content hosting platform’ means a
Amendment 416 #
Motion for a resolution Annex I – part B – Article 3 –point 1 a (new) (1a) 'Dominant platforms' or 'dominant content hosting platforms' means an information society service with several of the following characteristics: (a) ‘bottleneck power’ – which means the capacity to develop or preserve its user base because of network effects which lock in a significant part of its users, or its positioning in the downstream market allows it to create economic dependency; (b) a considerable size in the market, measured either by the number of active users or by the annual global turnover of the platform; (c) integration into a business or network environment controlled by its group or parent company, which allows for leveraging market power from one market into an adjacent market; (d) a gatekeeper role for a whole category of content or information; (e) access to large amounts of high-quality personal data, either provided by users or inferred about users based on monitoring their online behaviour, such data being indispensable for providing and improving a similar service and difficult for potential competitors to access or replicate;
Amendment 417 #
Motion for a resolution Annex I – part B – Article 3 –point 2 (2)
Amendment 418 #
Motion for a resolution Annex I – part B – Article 3 –point 4 (4) ‘content moderation’ means the practice of monitoring and applying a pre-determined set of rules and guidelines to user-generated content in order to protect public interests and ensure that the content complies with proportionate legal and regulatory requirements, community guidelines and terms and conditions, as well as any resulting measure taken by the platform, such as the swift removal of the content or the deletion or suspension of the user’s account, be it through automated means or human operators;
Amendment 419 #
Motion for a resolution Annex I – part B – Article 3 –point 4 (4) ‘content moderation’ means the practice of monitoring and applying a pre-determined set of rules and guidelines to user-
Amendment 42 #
Motion for a resolution Recital F F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing
Amendment 420 #
Motion for a resolution Annex I – part B – Article 3 –point 5 (5) ‘content curation’ means the practice of selecting, optimising, prioritising and recommending content based on individual user profiles for the purpose of its display on a website or application;
Amendment 421 #
Motion for a resolution Annex I – part B – Article 3 –point 5 a (new) (5a) 'Sponsorship' means content paid for or placed on behalf of a third party;
Amendment 422 #
Motion for a resolution Annex I – part B – Article 3 – point 6 (6) ‘terms and conditions’ means all terms, conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the
Amendment 423 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 1 1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall comply with relevant principles of human rights law and be appropriate, relevant and limited to what is necessary in relation to the purposes for which the content is managed. Content hosting platforms with 100,000 users or more shall conduct assessments of the direct and indirect human rights impact of their current and future content management practices on users and affected parties, and ensure appropriate follow-up to these assessments, including monitoring and evaluating the effectiveness of identified responses.
Amendment 424 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 1 1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, relevant and limited to what is necessary in relation to the purposes for which the content is managed, whilst ensuring respect for the principle of accountability.
Amendment 425 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 1 1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, proportionate to the type and scale of content, relevant and limited to what is necessary in relation to the purposes for which the content is managed.
Amendment 426 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 1 1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, relevant and
Amendment 427 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 1 a (new) 1a. Dominant content hosting platforms shall evaluate the risks of their content management policies.
Amendment 428 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 2 Amendment 429 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 2 2. Users shall not be subjected to
Amendment 43 #
Motion for a resolution Recital F F. whereas content hosting platforms may determine what content is shown to their users, thereby
Amendment 430 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 3 3. Content hosting platforms shall provide the users with sufficient information on their content curation profiles and the individual criteria according to which content hosting platforms curate content for them, including if algorithms are used and their objectives.
Amendment 431 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 4 4. Content hosting platforms shall provide users with an appropriate degree of influence over the curation of content made visible to them, including the choice of opting out of content curation altogether. In particular, users shall not be subject to content curation without their specific prior consent.
Amendment 432 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 4 a (new) 4a. The provisions of paragraph (4) shall not apply where a platform is required under legal or regulatory provisions to save content;
Amendment 433 #
Motion for a resolution Annex I – part B – Article 4 – paragraph 4 a (new) 4a. The accounts of users who repeatedly publish illegal content must be closed.
Amendment 434 #
Motion for a resolution Annex I – part B – Article 4 a (new) Amendment 435 #
Motion for a resolution Annex I – part B – Article 4 a (new) Article 4a Responsibility for goods 1. Any person procuring goods from a content hosting platform or through advertising on a platform shall have the right to pursue remedies against the platform if the person has pursued his or her remedies against the supplier but has failed to obtain the satisfaction to which he or she is entitled according to the law or the contract for the supply of goods. 2. The Commission should publish guidelines, in particular for small and medium-sized platforms, in order to support them in coping with their responsibility for goods and to ensure that administrative burdens are kept to a minimum. 3. A platform that has become liable according to this article shall have the right to be indemnified by the supplier.
Amendment 436 #
Motion for a resolution Annex I – part B – Article 4 a (new) Article 4a Structured risk dialogue on content curation As part of a structured risk dialogue with the existing or new European Agency, or European body together with the relevant national authorities, the dominant content hosting platforms shall present a report to the Commission or relevant Agency or European body on their risk management of content curation on their platform and how they mitigate these risks.
Amendment 437 #
Motion for a resolution Annex I – part B – Article 4 b (new) Article 4b Transparency obligation 1. Digital services actively hosting or moderating online content shall take the necessary measures to disclose the funding and the power of interest groups behind those using their services, so that the person legally responsible and accountable is identifiable. 2. Digital service providers without a permanent establishment in the EU shall designate a legal representative for users' interests within the European Union and make the contact information of this representative visible and accessible on their websites.
Amendment 438 #
Motion for a resolution Annex I – part B – Article 5 – subparagraph 1 Any natural or legal person or public body to which content is provided through a website or application shall have the right to issue a notice pursuant to this Regulation with or without providing personal data.
Amendment 439 #
Motion for a resolution Annex I – part B – Article 5 – subparagraph 1 Any natural or legal person or public body to which content is provided through a website
Amendment 44 #
Motion for a resolution Recital F F. whereas content hosting platforms may determine what content is shown to
Amendment 440 #
Motion for a resolution Annex I – part B – Article 5 – subparagraph 2 Amendment 441 #
Motion for a resolution Annex I – part B – Article 5 – subparagraph 2 Amendment 442 #
Motion for a resolution Annex I – part B – Article 5 – subparagraph 2 A content hosting platform
Amendment 443 #
Motion for a resolution Annex I – part B – Article 6 – introductory part Content hosting platforms shall include in their terms and conditions
Amendment 444 #
Motion for a resolution Annex I – part B – Article 6 –point c (c) the deadline for the content hosting platform to expeditiously treat a notice and take a decision;
Amendment 445 #
Motion for a resolution Annex I – part B – Article 6 –point d (d) the deadline for the content hosting platform to inform both parties about the outcome of the decision
Amendment 446 #
Motion for a resolution Annex I – part B – Article 7 –introductory part A notice regarding content shall be made in writing and shall include at least the following information:
Amendment 447 #
Motion for a resolution Annex I – part B – Article 7 –point a (a) a link to the content in question and, where appropriate (e.g. for video content), a timestamp;
Amendment 448 #
Motion for a resolution Annex I – part B – Article 8 –introductory part Upon a notice being issued, and before any decision on the content has been made, the uploader of the content in question shall receive the following information and be heard, where the uploader has chosen to identify themselves and provide contact details:
Amendment 449 #
Motion for a resolution Annex I – part B – Article 8 –introductory part Amendment 45 #
Motion for a resolution Recital F F. whereas content hosting platforms
Amendment 450 #
Motion for a resolution Annex I – part B – Article 8 –introductory part Upon a notice being issued,
Amendment 451 #
Motion for a resolution Annex I – part B – Article 8 –– point a (a) the reason for the notice and for the action taken;
Amendment 452 #
Motion for a resolution Annex I – part B – Article 9 – point 1 1. Content hosting platforms shall ensure that decisions on notifications are taken without undue delay following the necessary investigations. In the case of notifications from trusted flaggers, content hosting platforms may use shortened procedures.
Amendment 453 #
Motion for a resolution Annex I – part B – Article 9 – point 1 1. Content hosting platforms shall ensure that decisions on notifications are taken by qualified staff without undue delay following the necessary investigations.
Amendment 454 #
Motion for a resolution Annex I – part B – Article 9 – point 1 a (new) 1a. Platforms must have a real-time alert and response mechanism for infringements concerning live content issues.
Amendment 455 #
Motion for a resolution Annex I – part B – Article 9 – point 2 2. Following a notice, content hosting platforms shall decide to remove, take down or make invisible content that was the subject of a notice, if such content does not comply with legal and regulatory requirements, community guidelines or
Amendment 456 #
Motion for a resolution Annex I – part B – Article 9 – point 2 2. Following a notice, content hosting platforms shall decide to remove, take down or make invisible content that was the subject of a notice, if such content does not comply with legal and regulatory requirements, community guidelines or terms and conditions. Content hosting platforms must ensure that such content cannot reappear on their platform.
Amendment 457 #
Motion for a resolution Annex I – part B – Article 9 – point 2 2. Following a notice, content hosting platforms shall decide to remove, take down or make invisible content that was the subject of a notice without delay, if such content does not comply with legal and regulatory requirements, community guidelines or
Amendment 458 #
Motion for a resolution Annex I – part B – Article 9 – point 2 2. Following a notice, content hosting platforms shall decide to remove, take down or
Amendment 459 #
Motion for a resolution Annex I – part B – Article 9 – point 2 2. Following a notice, content hosting platforms shall
Amendment 46 #
Motion for a resolution Recital F F. whereas content hosting platforms may determine what content is shown to
Amendment 460 #
Motion for a resolution Annex I – part B – Article 9 – point 2 a (new) 2a. Content hosting platforms shall put in place measures to ensure that previously notified and removed content is not reuploaded online, through the establishment of a clear stay down obligation.
Amendment 461 #
Motion for a resolution Annex I – part B – Article 10 – introductory part Once a decision has been taken, content hosting platforms shall inform all parties involved in the notice procedure about the outcome of the decision, providing the following information in a clear and simple manner:
Amendment 462 #
Motion for a resolution Annex I – part B – Article 10 – point b Amendment 463 #
Motion for a resolution Annex I – part B – Article 10 – point b (b) whether the decision was made by a human or an algorithm and in the latter case the mechanisms through which human oversight was ensured;
Amendment 464 #
Motion for a resolution Annex I – part B – Article 10– point b (b) whether the decision was made by a human or an algorithm and in the latter case, whether a human review has taken place;
Amendment 465 #
Motion for a resolution Annex I – part B – Article 10 – point c (c) information about the possibility for review as referred to in Article 11
Amendment 466 #
Motion for a resolution Annex I – part B – Article 11 Amendment 467 #
Motion for a resolution Annex I – part B – Article 11 Amendment 47 #
Motion for a resolution Recital F F. whereas content hosting platforms may determine what
Amendment 470 #
Motion for a resolution Annex I – part B – Article 12 – paragraph 1 Amendment 471 #
Motion for a resolution Annex I – part B – Article 12 – paragraph 1 Amendment 472 #
Motion for a resolution Annex I – part B – Article 12 – paragraph 1 Without prejudice to judicial or administrative orders regarding content online, content that has been t
Amendment 473 #
Motion for a resolution Annex I – part B – Article 12 – paragraph 1 Without prejudice to judicial or administrative orders regarding
Amendment 474 #
Motion for a resolution Annex I – part B – Article 12 – paragraph 1 a (new) Digital service providers should act expeditiously to make unavailable or remove illegal content that has been notified to them and make best efforts to prevent future uploads of the same content.
Amendment 475 #
Motion for a resolution Annex I – part B – Article 13 – paragraph 1 1. Member States shall establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse when decisions on content moderation are appealed against. The independent dispute settlement bodies should as a minimum comply with the quality requirements for consumer ADR bodies set down under Directive 2013/11/EU.
Amendment 476 #
Motion for a resolution Annex I – part B – Article 13 – paragraph 1 1. Member States
Amendment 477 #
Motion for a resolution Annex I – part B – Article 13 – paragraph 3 3. The referral of a question regarding content moderation to an independent dispute settlement body shall not preclude a user from being able to have further recourse in the courts unless the dispute has been settled by common agreement.
Amendment 478 #
Motion for a resolution Annex I – part B – Article 13 – paragraph 4 Amendment 479 #
Motion for a resolution Annex I – part B – Article 13 – paragraph 4 4. Content hosting platforms that enjoy a dominant position on the market shall contribute financially to the operating costs of the independent dispute settlement bodies through a dedicated fund. Member States shall ensure these bodies are provided with adequate resources.
Amendment 48 #
Motion for a resolution Recital F F. whereas c
Amendment 480 #
Motion for a resolution Annex I – part B – Article 13 – paragraph 4 4. Content hosting platforms that enjoy
Amendment 481 #
Motion for a resolution Annex I – part B – Article 14 – paragraph 1 1. The uploader as well as non-profit entities with a legitimate interest in defending freedom of expression and information shall have the right to
Amendment 482 #
Motion for a resolution Annex I – part B – Article 14 – paragraph 1 1. The uploader shall have the right to refer a case of content moderation to the competent independent dispute settlement body where the content hosting platform has decided to remove
Amendment 483 #
Motion for a resolution Annex I – part B – Article 14 – paragraph 3 Amendment 484 #
Motion for a resolution Annex I – part B – Article 14 – paragraph 3 3. As regards jurisdiction, the competent independent dispute settlement body shall be that located in the Member State in which the content that is the subject of the dispute has been uploaded. For natural persons, it should always be possible to bring complaints to the independent dispute settlement body of their Member State.
Amendment 485 #
Motion for a resolution Annex I – part B – Article 14 – paragraph 3 a (new) 3a. Both the place where the content has been uploaded and the place where it has been accessed shall be deemed to constitute a ground of jurisdiction;
Amendment 486 #
Motion for a resolution Annex I – part B – Article 14 – paragraph 4 4. Where the notifier has the right to refer a case of content moderation to an independent dispute settlement body in accordance with paragraph 2, the notifier may refer the case to the independent dispute settlement body located in the Member State of habitual residence of the notifier or the uploader, if the latter is using the service for non-commercial purposes.
Amendment 487 #
Motion for a resolution Annex I – part B – Article 16 a (new) Article 16a Sanctions Member States shall provide for penalties where a person acting for purposes relating to their trade, business, craft or profession systematically and repeatedly submits wrongful notices. Such penalties shall be effective, proportionate and dissuasive.
Amendment 488 #
Motion for a resolution Annex I – part B – Article 17 A
Amendment 489 #
Motion for a resolution Annex I – part B – Article 18 – paragraph 1 1. Member States shall provide the Commission with all relevant information regarding the implementation and application of this Regulation. On the basis of the information provided and of public consultation, the Commission shall, by ... [three years after entry into force of this Regulation], submit a report to the European Parliament and to the Council on the implementation and application of this Regulation and consider the need for additional measures, including, where appropriate, amendments to this Regulation.
Amendment 49 #
Motion for a resolution Recital G G. whereas upholding the law in the digital world does not only involve effective enforcement of rights
Amendment 490 #
Motion for a resolution Annex I – part B – Article 18 – paragraph 2 – point a (a) the number of disputes referred to the independent dispute settlement bodies and the types of content;
Amendment 5 #
Motion for a resolution Citation 7 b (new) - having regard to the Recommendation of the Commission of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177),
Amendment 50 #
Motion for a resolution Recital G G. whereas upholding the law in the digital world does
Amendment 51 #
Motion for a resolution Recital G G. whereas upholding the law in the digital world
Amendment 52 #
Motion for a resolution Recital G G. whereas upholding the law in the digital world does not only involve effective enforcement of rights, but also, in particular, ensuring access to justice for all; whereas delegation of the taking of decisions regarding the legality of content or of law enforcement powers to private companies c
Amendment 53 #
Motion for a resolution Recital G G. whereas upholding the law in the digital world does not only involve effective enforcement of rights, but also, in particular, ensuring access to justice for all; whereas delegation of the taking of decisions regarding the legality of content or of law enforcement powers to private companies
Amendment 54 #
Motion for a resolution Recital H Amendment 55 #
Motion for a resolution Recital H H. whereas
Amendment 56 #
Motion for a resolution Recital H H. whereas content hosting platforms often employ automated content removal mechanisms
Amendment 57 #
Motion for a resolution Recital H H. whereas content hosting platforms often employ automated content removal mechanisms that in some cases can raise legitimate rule of law concerns, in particular when they are not encouraged by Union laws to employ such mechanisms pro-actively and voluntarily, resulting in content removal
Amendment 58 #
Motion for a resolution Recital H H.
Amendment 59 #
Motion for a resolution Recital H a (new) Ha. whereas freedom of expression is a fundamental right enshrined in the Charter of Fundamental Rights of the European Union, which, however, cannot lead to the expression of hate, racist, anti-Semitic, xenophobic or homophobic content, and whereas appropriate ways and means are needed as a matter of urgency to tackle the extremely serious violations currently taking place;
Amendment 6 #
Motion for a resolution Citation 7 c (new) - having regard to the Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market,
Amendment 60 #
Motion for a resolution Recital H a (new) Ha. whereas automated content removal mechanisms of digital service providers should be proportionate, covering only those justified cases where the benefits of removing content outweigh the potential disadvantages of keeping the content online; whereas these procedures should also be transparent and their terms and conditions should be made known to users before they use the service;
Amendment 61 #
Motion for a resolution Recital H a (new) Ha. whereas Article 11 of the Charter also protects the freedom and pluralism of the media, which are increasingly dependent on online platforms to reach their audiences; whereas online platforms should not interfere with media content;
Amendment 62 #
Motion for a resolution Recital I I. whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at
Amendment 63 #
Motion for a resolution Recital I I. whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at Union level as well as on laws passed by Member States at national level, and there are notable differences in the obligations imposed on content hosting platforms and in the enforcement mechanisms of the various civil law regimes; whereas this situation requires an appropriate response at Union level and internationally;
Amendment 64 #
Motion for a resolution Recital I I. whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at Union level as well as on laws passed by Member States at national level, and there are notable differences in the obligations imposed on content hosting platforms and in the enforcement mechanisms of the various civil law regimes;
Amendment 65 #
Motion for a resolution Recital I I. whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at Union
Amendment 66 #
Motion for a resolution Recital J J. whereas the current business model of certain content hosting platforms is to promote content that is likely to attract the attention of users and therefore generate more profiling data in order to offer more effective targeted advertisements and thereby increase profit; whereas this profiling coupled with targeted advertisement often leads to the amplification of content based on addressing emotions, often giving rise to sensationalism in news feeds and recommendation systems, resulting in the possible manipulation of users;
Amendment 67 #
Motion for a resolution Recital J J. whereas the current business model of certain content hosting platforms is to promote content that is likely to attract the attention of users and therefore generate more profiling data in order to offer more effective targeted advertisements and thereby increase profit; whereas this profiling coupled with targeted advertisement
Amendment 68 #
Motion for a resolution Recital L L. whereas the choice of algorithmic logic behind
Amendment 69 #
Motion for a resolution Recital L L. whereas the choice of algorithmic logic behind such recommendation systems, content curation or advertisement placements remains at the discretion of the content hosting platforms
Amendment 7 #
Motion for a resolution Citation 7 d (new) - having regard to the Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services,
Amendment 70 #
Motion for a resolution Recital L L. whereas the choice of algorithmic logic behind such recommendation systems, content curation or advertisement placements remains at the discretion of the content hosting platforms with little possibility for public oversight, which
Amendment 71 #
Motion for a resolution Recital M M. whereas
Amendment 72 #
Motion for a resolution Recital O O. whereas the terms and conditions of platforms, which are non-negotiable, often indicate both applicable law and competent
Amendment 73 #
Motion for a resolution Recital O O. whereas the terms and conditions of platforms, which are non-negotiable, often indicate both applicable law and competent courts outside the Union, which r
Amendment 74 #
Motion for a resolution Recital P P. whereas access to data
Amendment 75 #
Motion for a resolution Recital P P. whereas access to non-personal data is an important factor in the growth of the digital economy; whereas the interoperability of non-personal data can, by removing lock-in effects, play an important part in ensuring that fair market conditions exist;
Amendment 76 #
Motion for a resolution Recital P a (new) Pa. whereas, in updating the essential aspects of civil and commercial law relating to online business transactions, it is necessary to strike a balance between, on the one hand, the protection of users’ fundamental and civil rights and, on the other, European business incentives in this area, especially for SMEs and start- ups;
Amendment 77 #
Motion for a resolution Recital P a (new) Pa. whereas it is important to assess the possibility of tasking an existing or new European Agency, or European body, with the responsibility of ensuring a harmonised approach across the Union and address the new opportunities and challenges, in particular those of a cross- border nature, arising from ongoing technological developments.
Amendment 78 #
Motion for a resolution Recital P a (new) Pa. whereas the Commission has expressed the concern in its opinion on the French "Avia Law" that content moderation could place a disproportionate burden on companies and users, and lead to an excessive deletion of content, thus undermining freedom of expression.
Amendment 79 #
Motion for a resolution Paragraph 1 1.
Amendment 8 #
Motion for a resolution Citation 7 e (new) - having regard to the Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography,
Amendment 80 #
Motion for a resolution Paragraph 1 1. Requests that the Commission
Amendment 81 #
Motion for a resolution Paragraph 1 1. Requests that the Commission submit without undue delay a set of legislative proposals comprising a Digital Services Act with a wide material, personal and territorial scope
Amendment 82 #
Motion for a resolution Paragraph 1 1. Requests that the Commission submit without undue delay a set of evidence-based and proportionate legislative proposals comprising a Digital Services Act with a wide material, personal and territorial scope, including the recommendations as set out in the Annex to this resolution; considers that, without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be chosen as the legal basis;
Amendment 83 #
Motion for a resolution Paragraph 1 1. Requests that the Commission submit without undue delay a set of legislative proposals comprising a Digital Services Act with a
Amendment 84 #
Motion for a resolution Paragraph 2 2. Proposes that the Digital Services Act include a regulation that establishes contractual rights as regards content management, lays down transparent, binding and uniform standards and procedures for content moderation, and guarantees
Amendment 85 #
Motion for a resolution Paragraph 2 2. Proposes that the Digital Services Act include a regulation that establishes contractual rights as regards content management, lays down transparent, binding and uniform standards and procedures for content moderation, and guarantees accessible and independent recourse to judicial redress; proposes that, for the purposes of drawing up the legislative act, account be taken in particular of consultations with SMEs and start-ups in the field, so as to ensure the relevant provisions are easy to access and apply to these entities;
Amendment 86 #
Motion for a resolution Paragraph 2 2. Proposes that the Digital Services Act include a regulation that
Amendment 87 #
Motion for a resolution Paragraph 2 2. Proposes that the Digital Services Act
Amendment 88 #
Motion for a resolution Paragraph 2 2. Proposes that the Digital Services Act
Amendment 89 #
Motion for a resolution Paragraph 2 2.
Amendment 9 #
Motion for a resolution Citation 7 f (new) - having regard to the Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism,
Amendment 90 #
Motion for a resolution Paragraph 2 a (new) 2a. Considers that, in the context of the development of online services and in a globalised digital world, the country of origin principle may be unsuitable for reasons recognised in the case law of the Court of Justice of the European Union, in particular as regards consumer protection and intellectual property; whereas these platforms are primarily driven by the search for countries where regulations are less restrictive in a number of areas, whether to do with taxation or in connection with illegal or illicit activities; whereas, as a result, it would certainly be useful, in sectors where it is not already established, to apply instead the principle of the country of destination, which would make it possible in future to remedy certain shortcomings in the principle of the law of the country of origin;
Amendment 91 #
Motion for a resolution Paragraph 2 a (new) 2a. Proposes that the Digital Services Act set up clear rules on the responsibility of content hosting platforms for goods sold or advertised on them, in order to close the legal gap whereby the user fails to obtain the satisfaction to which he or she is entitled according to the law or the contract for the supply of goods, for example because of the inability to identify the primary seller;
Amendment 92 #
Motion for a resolution Paragraph 2 a (new) 2a. With regard to platform responsibility for content, new technological solutions regarding matters of responsibility, identity and anonymity should be found and deployed.
Amendment 93 #
Motion for a resolution Paragraph 2 a (new) 2a. Proposes that the Digital Services Act follow a sector and problem-specific approach and make a clear distinction between illegal and harmful content when elaborating the appropriate policy options;
Amendment 94 #
Motion for a resolution Paragraph 2 a (new) 2a. Requests that the Commission include in the regulation a universal definition of 'dominant platforms' and lay down their characteristics.
Amendment 95 #
Motion for a resolution Paragraph 2 b (new) 2b. Notes that transparency requirements must be applied to certain platforms in order to ensure that their operation in a closed system does not affect consumer choice, influence consumer behaviour or constitute a barrier to freedom of opinion or expression; stresses that, in the case of an online trading platform, the use of any identical product or service, or of a distinctive sign similar to a recognised trademark, poses a risk of confusion on the part of the public and of damage to the trademark itself; when the service provider becomes aware of such a risk, it must withdraw the information or product, or disable access to it, as soon as possible;
Amendment 96 #
Motion for a resolution Paragraph 2 b (new) 2b. Underlines that any new framework established in the Digital Services Act should be manageable for small businesses, SMEs and start-ups and should therefore include proportionate obligations and clear safeguards for all sectors;
Amendment 97 #
Motion for a resolution Paragraph 2 c (new) 2c. Proposes that the Digital Services Act introduce enhanced transparency rules for social media platforms, requiring disclosure of the funding and influence of the interest groups behind those using the digital services, so as to show who is legally responsible for the content;
Amendment 98 #
Motion for a resolution Paragraph 2 d (new) 2d. Proposes that the Digital Services Act lay down the obligation for digital service providers without a permanent establishment in the EU to designate a legal representative in the interest of users within the European Union, and to make the contact information of that representative visible and accessible on their websites;
Amendment 99 #
Motion for a resolution Paragraph 2 e (new) 2e. Underlines that online platforms hosting or moderating content online should bear more responsibility for the content they host and should act to proactively prevent illegality;
source: 652.466