99 Amendments of Adam BIELAN related to 2020/2018(INL)
Amendment 4 #
Motion for a resolution
Citation 2 a (new)
- having regard to the European Parliament resolution of 12 December 2018 on the single market package (2018/2903(RSP)),
Amendment 7 #
Motion for a resolution
Citation 3 a (new)
- having regard to the Communication from the Commission of 10 March 2020, entitled “An SME Strategy for a sustainable and digital Europe” (COM/2020/103),
Amendment 9 #
Motion for a resolution
Citation 4 a (new)
- having regard to the commitments made by the Commission in its “Political Guidelines for the next European Commission 2019-2024” and before the European Parliament on 10 September 2019,
Amendment 13 #
Motion for a resolution
Citation 7 a (new)
- having regard to the European Parliament resolution of 15 June 2017 on online platforms and the digital single market (2016/2276(INI)),1a __________________ 1a OJ C 331, 18.9.2018, p. 135
Amendment 15 #
Motion for a resolution
Citation 7 b (new)
Amendment 18 #
Motion for a resolution
Recital A
A. whereas e-commerce influences the everyday lives of people, businesses and consumers in the Union, and when operated in a fair and regulated level playing field, may have contributed positively to unlocking the potential of the Digital Single Market; whereas further discussion is needed in order to find out whether and how to enhance consumer trust and provide newcomers, and in particular micro, small and medium enterprises, with new market opportunities for sustainable growth and jobs;
Amendment 28 #
Motion for a resolution
Recital B
B. whereas Directive 2000/31/EC of the European Parliament and of the Council2 (“the E-Commerce Directive”) has been one of the most successful pieces of Union legislation and has shaped the Digital Single Market as we know it today; whereas the E-Commerce Directive was adopted 20 years ago and it may no longer adequately reflect the rapid transformation and expansion of e-commerce in all its forms, with its multitude of different emerging services, providers and challenges; __________________ 2 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
Amendment 29 #
Motion for a resolution
Recital B a (new)
Ba. whereas Directive 2005/29/EC concerning unfair business-to-consumer commercial practices in the internal market, as amended by Directive (EU) 2019/2161, and Directives (EU) 2019/770 and (EU) 2019/771 on certain aspects concerning contracts for the supply of digital content and digital services and contracts for the sale of goods have only recently been adopted; whereas other proposals, such as the proposal for a Regulation on preventing the dissemination of terrorist content online and the proposal for a Directive on representative actions for the protection of the collective interests of consumers, are in the legislative process;
Amendment 30 #
Motion for a resolution
Recital C
C. whereas, despite the clarifications made by the European Court of Justice, there seems to be a lack of enforcement and cooperation between Member States;
Amendment 36 #
Motion for a resolution
Recital C a (new)
Ca. whereas recent efforts to introduce national regulations within the scope of the announced Digital Services Act could undermine the achievements made regarding the Digital Single Market and introduce barriers to the detriment of cross-border commerce;
Amendment 40 #
Motion for a resolution
Recital D
D. whereas the COVID-19 outbreak caused major supply and demand shocks, adversely affected European businesses and has brought new social and economic challenges that deeply affect our citizens; whereas the e-commerce sector showed resilience and offers potential as a driver for relaunching the European economy;
Amendment 47 #
Motion for a resolution
Recital D a (new)
Da. whereas legal certainty and business-friendly legislation are essential to seed and grow innovative businesses in the Union, and to further close the gap with the global digital leaders;
Amendment 55 #
Motion for a resolution
Recital E a (new)
Ea. whereas the E-Commerce Directive requires platforms to take down illegal activity and illegal information but does not define them, which makes such content hard to distinguish from other harmful but not illegal content;
Amendment 57 #
Motion for a resolution
Recital E b (new)
Eb. whereas the need for any further measures should be demonstrated by relevant data, statistics and analyses rather than by occasional reprehensible cases;
Amendment 67 #
Motion for a resolution
Paragraph 1 b (new)
1b. Recalls that it is of the utmost importance to prepare the proposal cautiously, following facts, statistics and best practices rather than several condemnable cases or outdated or partial statistics, in order to avoid any unintended consequences hampering innovation and consumer choice; stresses that gold-plating of Union legislation by Member States and unnecessary regulatory burdens or restrictions must be avoided, and that the new obligations for platforms should be proportionate and their meaning clear;
Amendment 71 #
Motion for a resolution
Paragraph 1 d (new)
1d. Reiterates its belief that an evidence-based approach is essential for generating a comprehensive understanding in this field; asks the Commission to provide a detailed analysis of the need for and impact of the Digital Services Act package;
Amendment 73 #
Motion for a resolution
Paragraph 1 f (new)
1f. Asks the Commission, given the specific nature of the services covered by the E-Commerce Directive and the need to involve highly specialised experts, to provide a detailed quantification of the financial burden of the future proposal on the Union budget and the budgets of the Member States;
Amendment 74 #
Motion for a resolution
Paragraph 1 g (new)
1g. Welcomes the soft-law instruments used by the Commission in recent years to help all stakeholders understand the legislative environment for platforms, such as Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online; believes that the Commission should issue guidelines and recommendations explaining the regulatory environment for digital services in order to secure the rights of online users while stimulating innovation;
Amendment 95 #
Motion for a resolution
Paragraph 3
3. Stresses that the main principles of the E-Commerce Directive, such as the internal market clause, freedom of establishment and the prohibition on imposing a general monitoring obligation, should be maintained; considers that consumer protection and user safety should also become guiding principles of the future regulatory framework;
Amendment 98 #
Motion for a resolution
Paragraph 3 a (new)
3a. Notes that rules on consumer protection and user safety, including their enforcement rules, are well established by both EU and national legislation; asks the Commission to provide an analysis of the enforcement of these rules and of potential shortcomings in their enforcement;
Amendment 100 #
Motion for a resolution
Paragraph 4
4. Stresses that fair competition and a predictable, comprehensive EU-level framework without unnecessary burdens and restrictions are crucial in order to promote the growth of all businesses in the field, including European small-scale platforms, small and medium enterprises (SMEs) and start-ups, prevent market fragmentation and provide businesses, including the European ones, with a level playing field that enables them to better profit from the digital services market and be more competitive on the world stage;
Amendment 110 #
Motion for a resolution
Paragraph 5
Amendment 123 #
Motion for a resolution
Paragraph 5
5. Takes the view that a level playing field in the internal market between the platform economy and the "traditional" offline economy, based on the same rights and obligations for all interested parties - consumers and businesses - is needed; considers that the social protection and social rights of workers, especially of platform or collaborative economy workers, should be properly addressed in a specific instrument accompanying the future regulatory framework;
Amendment 125 #
Motion for a resolution
Paragraph 5
5. Takes the view that differentiating between the “digital” single market and the “offline” single market does not describe market realities; supports a level playing field for all participants of the internal market; notes that the social protection and social rights of workers, especially of platform or collaborative economy workers, are subject to national policies and should only be addressed on the EU level in accordance with the proportionality and subsidiarity principles;
Amendment 139 #
Motion for a resolution
Paragraph 6
6. Considers that the Digital Services Act should be based on the public values of the Union protecting citizens’ rights, and in particular on safeguarding freedom of speech and expression, and should aim to foster the creation of a rich and diverse online ecosystem with a wide range of online services, a favourable digital environment and legal certainty to unlock the full potential of the Digital Single Market;
Amendment 155 #
Motion for a resolution
Paragraph 7 a (new)
7a. Asks the Commission to take into account whether reciprocal obligations adopted by third countries in reaction to the new EU rules would hamper the provision of services by EU-based companies in those countries;
Amendment 158 #
Motion for a resolution
Subheading 1 a (new)
Innovation and growth
Amendment 159 #
Motion for a resolution
Paragraph 7 b (new)
7b. Recalls the common interest in supporting and enhancing research, innovation and the growth of competition on the digital market; notes that different rules for different providers of information society services, based on their size or other criteria, might run counter to the spirit of fair competition rules; notes that overly prescriptive and strict rules have the potential to hamper innovation;
Amendment 170 #
Motion for a resolution
Paragraph 8
8. Notes that information society services providers, and in particular online platforms and social networking sites - because of their wide-reaching ability to reach and influence broader audiences, behaviour, opinions, and practices - should cooperate on protecting users and society at large and on preventing their services from being exploited abusively.
Amendment 172 #
Motion for a resolution
Paragraph 8 a (new)
8a. Stresses that confusing the role a private platform should play with roles that more properly fall within the remit of public bodies charged with enforcing or setting the law is unacceptable and creates risks for both citizens and businesses, neither of which is qualified to take such decisions;
Amendment 177 #
Motion for a resolution
Paragraph 9
9. Recalls that recent scandals regarding data harvesting and selling, Cambridge Analytica, fake news, political advertising and manipulation and a host of other online harms (from hate speech to the broadcast of terrorism) have shown the need to revisit the existing rules and reinforce fundamental rights; considers that any reflection should examine how to reinforce fundamental rights, especially freedom of expression; recalls in this respect certain established self-regulatory and co-regulatory schemes, such as the Code of Practice on disinformation, which have played a positive role in addressing those issues and could serve as a basis for future legislation;
Amendment 188 #
Motion for a resolution
Paragraph 10
10. Stresses that, as is the case with the E-Commerce Directive, the Digital Services Act should achieve the right balance between the internal market freedoms and the fundamental rights and principles set out in the Charter of Fundamental Rights of the European Union;
Amendment 189 #
Motion for a resolution
Paragraph 10
10. Stresses that the Digital Services Act should be based on the principles of the internal market freedoms and the recognition of fundamental rights of the European Union;
Amendment 197 #
Motion for a resolution
Paragraph 11
11. Notes that the COVID-19 pandemic has exposed the challenges EU consumers may face when shopping online, e.g. misleading trading practices by dishonest traders selling fake or illegal products online that are not compliant with Union safety rules or imposing unjustified and abusive price increases or other unfair conditions on consumers; recalls, however, the number of proactive measures introduced by some online platforms that are addressing these issues;
Amendment 206 #
Motion for a resolution
Paragraph 12
12. Stresses that this problem is aggravated by the fact that often the identity of these companies cannot be established; recalls that recent legislation adopted under the “New Deal for Consumers” imposes transparency obligations on marketplaces, making it clear with whom a consumer is contracting;
Amendment 220 #
Motion for a resolution
Paragraph 13
13. Considers that the current transparency and information requirements set out in the E-Commerce Directive on information society services providers and online sellers, and the minimum information requirements on commercial communications, should be properly analysed and subsequently, if needed, improved;
Amendment 234 #
Motion for a resolution
Paragraph 14
14. Calls on the Commission to analyse the need to require service providers to verify the information and identity of the business partners with whom they have a contractual commercial relationship, and to require that the information they provide is accurate and up-to-date;
Amendment 248 #
Motion for a resolution
Paragraph 15
15. Calls on the Commission to introduce enforceable obligations on internet service providers aimed at increasing transparency and information; considers that these obligations should be enforced by appropriate, proportionate, effective and dissuasive penalties;
Amendment 258 #
Motion for a resolution
Paragraph 16
16. Stresses that existing obligations, set out in the E-Commerce Directive and Directive 2005/29/EC of the European Parliament and of the Council (‘Unfair Commercial Practices Directiveʼ)3 on transparency of commercial communications and digital advertising, should be frequently reviewed; points out that pressing consumer protection concerns about profiling, targeting and personalised pricing were recently addressed in the “New Deal for Consumers”3a legislation, which awaits full transposition and enforcement; __________________ 3 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (OJ L 149, 11.6.2005, p. 22). 3a Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules.
Amendment 274 #
Motion for a resolution
Paragraph 17
17. Believes that while AI-driven services, currently governed by the E-Commerce Directive, have enormous potential to deliver benefits to consumers and service providers, the new Digital Services Act should also address the challenges they present in terms of ensuring non-discrimination, transparency and explainability of inputs for algorithms, as well as liability; points out the need to monitor algorithms and the outputs for which algorithms are optimised; points out the need to assess the associated risks of using AI, to use high quality and unbiased underlying datasets, as well as to help individuals acquire access to diverse content, opinions, high quality products and services;
Amendment 287 #
Motion for a resolution
Paragraph 18
18. Considers that consumers should be properly informed and their rights should be effectively guaranteed when they interact with automated decision-making systems and other innovative digital services or applications; believes that it is, and should be, possible for consumers to request checks and corrections of possible mistakes resulting from automated decisions, as well as to seek redress for any damage related to the use of automated decision-making systems; believes that a decision issued via automated decision-making should be subject to a remedy outside the automated system, i.e. by human assessment;
Amendment 294 #
Motion for a resolution
Paragraph 18 a (new)
18a. Notes that automated content moderation tools are incapable of effectively understanding the subtlety of context and meaning in human communication, which is necessary to determine whether assessed content may be considered to violate the law or terms of service; stresses therefore that the use of such tools should not be mandated by law;
Amendment 316 #
Motion for a resolution
Paragraph 20
20. Notes that there is no ‘one size fits all’ solution to all types of illegal and harmful content and cases of misinformation online; recalls that misinformative and harmful content is not always illegal; further requests that a definition of illegal information and activities be established in order to simplify compliance; believes that a more aligned approach at Union level will make the fight against illegal content more effective;
Amendment 332 #
Motion for a resolution
Paragraph 21
21. Considers that voluntary actions and self-regulation by online platforms across Europe have brought some benefits, but additional measures are needed in order to ensure the swift detection and removal of illegal content online; points out that the code of conduct on countering illegal hate speech online improved the platforms’ response to flagged content to 89 % within 24 hours, 95 % within 48 hours and 99.3 % within a week; asks the Commission for the code of conduct to cover actions related to feedback provided to users of platforms, to ensure that users are informed of how their notifications were resolved;
Amendment 341 #
Motion for a resolution
Paragraph 21 a (new)
21a. Considers that more legal clarity is needed to encourage platforms and information society services providers to engage in additional voluntary actions for content moderation, above what is required by law; points out that the current EU legal regime creates an incentive for platforms and information society services providers either to refrain from taking reasonable proactive moderation measures, or to over-remove valuable content in the course of moderation, for fear of losing their safe harbour protections and facing legal consequences;
Amendment 346 #
Motion for a resolution
Paragraph 21 b (new)
21b. Considers that any deployment of voluntary measures for content moderation shall not be treated as giving information society services providers actual knowledge of illegal activities happening on their platforms; underlines that information society services providers shall not be held liable if they have not obtained actual knowledge or awareness of such activities; stresses that the limited liability principle has been one of the key enablers of European innovation;
Amendment 348 #
Motion for a resolution
Paragraph 22
22. Calls on the Commission to address the extent to which national rules in the Member States are circumventing the basic rules of the E-Commerce Directive - the country of origin principle - and to propose concrete non-legislative or legislative measures, including a transparent notice-and-action mechanism, that can empower both users and online intermediaries to deal appropriately with potentially illegal online content or behaviour, help information service providers to make faster and more precise decisions on content moderation, and which could empower the enforcement authorities to apply existing rules in a coherent and legally sound way; is of the opinion that such measures would guarantee a high level of users' and consumers' protection while promoting consumer trust in the online economy; stresses that content moderation rules and decisions should be clear and predictable for consumers;
Amendment 353 #
Motion for a resolution
Paragraph 22
22. Calls on the Commission to address the increasing differences and fragmentation of national rules in the Member States and to propose concrete legislative measures, including a well-defined notice-and-takedown mechanism with boundaries, that can empower users to notify online intermediaries of the existence of potentially illegal online content or behaviour; highlights that such a mechanism can only be complete if it is introduced together with a counter-notice mechanism; is of the opinion that such measures would guarantee a high level of protection to all actors participating in the system, while promoting consumer trust in the online economy;
Amendment 372 #
Motion for a resolution
Paragraph 23
23. Stresses that the safeguards from the legal liability regime for hosting intermediaries with regard to user-uploaded content and the general monitoring prohibition set out in Article 15 of the E-Commerce Directive are still relevant and need to be preserved; recalls that the “primary” liability for illegal content should stay with the person uploading that content and should be different in volume and severity from the “secondary” liability of the service provider, i.e. responsibility for the timely removal of illegal content;
Amendment 383 #
Motion for a resolution
Paragraph 23 a (new)
23a. Asks the Commission to consider the introduction of a good Samaritan clause whereby service providers that use voluntary measures to detect and remove illegal content online should not lose their liability protection; recalls that voluntary content moderation measures do not necessarily mean full knowledge of illegal content uploaded by users and cannot in any case mean the introduction of a general monitoring principle in any form;
Amendment 407 #
Motion for a resolution
Paragraph 25
25. Stresses that it is unacceptable that Union consumers are exposed to illegal and unsafe products containing dangerous chemicals, as well as to other safety hazards; notes in this context the existence of the Rapid Alert System for dangerous non-food products;
Amendment 423 #
Motion for a resolution
Paragraph 26 a (new)
26a. Asks the Commission to provide exact data and analyses on unsafe and dangerous products originating from both the Union and third countries;
Amendment 439 #
Motion for a resolution
Paragraph 27
27. Notes that, today, some markets are characterised by large platforms with significant network effects which are able to act as de facto “online gatekeepers” of the digital economy; notes, however, that concentration in the digital economy as measured by the Herfindahl-Hirschman Index (HHI) is actually stagnating or decreasing;
Amendment 451 #
Motion for a resolution
Paragraph 28
28. Considers that by reducing barriers to market entry, including regulatory barriers, and by regulating large platforms, an internal market instrument imposing ex-ante regulatory remedies on these large platforms has the potential to open up markets to new entrants, including SMEs and start-ups, thereby promoting consumer choice and driving innovation beyond what can be achieved by competition law enforcement alone; stresses that ex-ante measures should be in line with the antitrust rules within the competition framework of the Union;
Amendment 469 #
Motion for a resolution
Paragraph 29
29. Believes that, in view of the cross-border nature of digital services, effective supervision and cooperation between Member States, including the sharing of best practices, is key to ensuring the proper enforcement of the Digital Services Act;
Amendment 498 #
Motion for a resolution
Paragraph 32
32. Calls on the Commission to gather information on all alternative dispute settlement solutions in the Member States, provide data on their functioning and analyse whether there is a need and a possibility to strengthen and modernise the current provisions on out-of-court settlement and court actions to allow for effective enforcement and consumer redress;
Amendment 504 #
Motion for a resolution
Annex I – part -I (new)
-1 The Digital Services Act package should be evidence-based and its impact assessment should, inter alia, include a quantification of the financial burden on the Union budget and the budgets of the Member States;
Amendment 514 #
Motion for a resolution
Annex I – part I – paragraph 3
The Digital Services Act should provide consumers and economic operators, especially micro, small and medium-sized enterprises, with legal certainty and transparency, and should support innovation while reducing barriers to market entry and to the provision of services, including regulatory barriers;
Amendment 536 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 1 – subi. 2
- clear and detailed procedures and measures related to the removal of illegal content online, including a code of conduct on a European notice-and-action mechanism;
Amendment 538 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 1 – subi. 3
- effective national supervision, cooperation among Member States and proportionate sanctions, with a preference for behavioural remedies;
Amendment 545 #
Motion for a resolution
Annex I – part I – paragraph 6 – indent 2
- an internal market legal instrument imposing ex-ante obligations on large platforms with a confirmed gatekeeper role in the digital ecosystem, complemented by an effective institutional enforcement mechanism.
Amendment 548 #
Motion for a resolution
Annex I – part II – paragraph 1
In the interest of legal certainty, the Digital Services Act should clarify which digital services fall within its scope. The new legal act should follow the horizontal nature of the E-Commerce Directive and apply not only to online platforms but to all digital services that are not covered by specific legislation;
Amendment 562 #
Motion for a resolution
Annex I – part II – paragraph 4
The Digital Services Act should maintain the possibility for Member States to set an effective level of consumer protection, maximising consumer welfare, and to pursue legitimate public interest objectives in accordance with EU law;
Amendment 567 #
Motion for a resolution
Annex I – part II – paragraph 6
The Digital Services Act should also clarify in a coherent way how its provisions interact with recently adopted rules on geo-blocking, product safety, platform-to-business relations and consumer protection, among others, and with other anticipated initiatives such as AI regulation;
Amendment 573 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 1
- clarify if and to what extent “new digital services”, such as social media networks, collaborative economy services, search engines, wifi hotspots, online advertising, cloud services, content delivery networks, and domain name services fall within the scope of the Digital Services Act;
Amendment 581 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 4
- clarify what falls within the remit of the “illegal content” definition, making it clear that a violation of EU rules on consumer protection, product safety or the offer or sale of food or tobacco products and counterfeit medicines also falls within the definition of illegal content;
Amendment 582 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 4
- clarify what falls within the remit of the “illegal content” and “illegal activity” definitions, making it clear that a violation of EU rules on consumer protection, product safety or the offer or sale of food or tobacco products and counterfeit medicines also falls within those definitions;
Amendment 592 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 5
- define “systemic operator” by establishing a set of clear economic indicators and their trends that allow regulatory authorities to identify platforms with a “gatekeeper” role that play a problematic systemic role in the online economy; such indicators could include considerations such as whether the undertaking is active to a significant extent on multi-sided markets, the size of its network (number of users, user time spent), its financial strength, its access to data, vertical integration, the importance of its activity for third parties’ access to supply and markets, any barriers to the provision of services by its competitors, etc.
Amendment 605 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 1 – indent 1
Amendment 623 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – introductory part
The Digital Services Act should require service providers to adopt fair and transparent contract terms and general conditions combining existing and new requirements:
Amendment 626 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 2
Amendment 631 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 4
Amendment 634 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 2 – indent 4 a (new)
- to ensure that the cancellation process is as effortless as the sign-up process (with no “dark patterns” or other undue influence on consumer decisions);
Amendment 647 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 3
- If technically feasible, proportionate and proven to provide added value, the transparency requirements could include the obligation to disclose who is paying for the advertising, including both direct and indirect payments or any other contributions received by service providers; those requirements should apply also to platforms, even if they are established in third countries; consumers and public authorities should be able to identify who should be held accountable in case of, for example, false or misleading advertisement;
Amendment 648 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 3
- The transparency requirements should include the obligation to disclose who is paying for the advertising, including both direct and indirect payments or any other contributions received by service providers; those requirements should apply also to platforms, even if they are established in third countries; consumers and public authorities should be able to identify who should be held accountable in case of, for example, false or misleading advertisement;
Amendment 652 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 3 – indent 4
- if the need is proven by analyses, Article 7 of the E-Commerce Directive should be revised or supported by effective enforcement measures in order to protect consumers from unsolicited commercial communications online.
Amendment 655 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4
Amendment 659 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4
Amendment 684 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subparagraph 4
Compliance with the due diligence provisions should be reinforced with effective, proportionate and dissuasive penalties, including the imposition of reasonable fines.
Amendment 689 #
Motion for a resolution
Annex I – part V – paragraph 1 – introductory part
The Digital Services Act or other ancillary non-legislative measures should provide clarity and guidance regarding how online intermediaries should tackle illegal content online while fully respecting the “no general monitoring” principle. The revised rules of the E-Commerce Directive should:
Amendment 718 #
Motion for a resolution
Annex I – part V – subheading 1
1. A notice-and-takedown mechanism
Amendment 722 #
Motion for a resolution
Annex I – part V – paragraph 2 – introductory part
The Digital Services Act should create non-binding guidelines for a notice-and-action mechanism based on a set of clear processes and precise timeframes for each step of the notice-and-action procedure. That notice-and-action mechanism should:
Amendment 724 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 1
- apply only to illegal online content or behaviour;
Amendment 752 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 11
- create an obligation for the online intermediaries to verify the content of the notice and reply to the notice provider and the content uploader with a reasoned decision;
Amendment 753 #
Motion for a resolution
Annex I – part V – paragraph 2 – indent 12
- provide remedies to contest the decision via a counter-notice, including if the content has been removed via automated solutions, if technically feasible and free from the risk of exposing the underlying technology and allowing “gaming” of the system, or unless such a counter-notice would conflict with an ongoing investigation by law enforcement authorities.
Amendment 767 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 3
- All interested parties should have the right to contest the decision through a counter-notice and by having recourse to an out-of-court dispute settlement mechanism; to this end, the rules of Article 17 of the E-Commerce Directive should be revised.
Amendment 781 #
Motion for a resolution
Annex I – part V – paragraph 3 – indent 5
- the description of the content moderation model applied by the hosting intermediary, as well as any algorithmic decision making which influences the content moderation process.
Amendment 793 #
Motion for a resolution
Annex I – part V – paragraph 5
The Digital Services Act should address the lack of legal certainty regarding the concept of active versus passive hosts. The revised measures should clarify whether interventions by hosting providers that have editorial functions and a certain “degree of control over the data”, through tagging, organising, promoting, optimising, presenting or otherwise curating specific content for profit-making purposes, and which amount to adoption of the third-party content as one’s own (as judged by average users or consumers), should lead to a loss of safe harbour provisions due to their active nature.
Amendment 804 #
Motion for a resolution
Annex I – part V – paragraph 6 a (new)
Voluntary measures
A voluntary measures clause would encourage companies to engage in additional voluntary actions for content moderation, above what is required by law. The purpose would be to remove the assumption, and the risk, that if a company engages in good faith in such voluntary actions, it automatically loses the safe harbour protection. In the current legislative environment, companies undertake such measures at their own risk, as they may incur liability for failing to act in relation to illegal content that they identify, even when they conclude in good faith that the content need not be removed. The risk of liability creates a perverse incentive for companies either to refrain from taking reasonable proactive moderation measures, or to over-remove valuable content in the course of moderation, thereby possibly violating freedom of speech or other fundamental rights. A voluntary measures clause would also ensure that where a platform or an information society service provider has voluntarily reviewed one or more pieces of content in respect of one or more types of unlawfulness (or for violations of its content policies, e.g. defamation), the provider is not deemed to have knowledge of the unlawfulness of other, unreviewed, pieces of content on its platform (e.g. copyright violations). Equally, the provision would ensure that where the information society service provider has voluntarily reviewed content in respect of one or more types of unlawfulness (or for violations of its content policies), the provider is not deemed to have knowledge of all of the other potential ways in which that same content might be unlawful.
Amendment 813 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 4
- ensure that online marketplaces remove, in accordance with notifications made by the relevant authorities, any misleading information given by the supplier or by customers, including misleading guarantees and statements made by the supplier;
Amendment 825 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 5
- once products have been identified as unsafe by the Union’s rapid alert systems or by consumer protection authorities, it should be compulsory to remove products from the marketplace within a reasonable time;
Amendment 844 #
Motion for a resolution
Annex I – part VI – paragraph 2 – indent 9
- explore positive incentives that could encourage further companies to join the voluntary commitment scheme called “Product Safety Pledge”.
Amendment 858 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 1
- set up an ex-ante mechanism to prevent (instead of merely remedy) market failures caused by the behaviour of “systemic platforms” in the digital world, building on the Platform to Business Regulation; such a mechanism should allow regulatory authorities to impose remedies on these companies in order to address market failures, without the establishment of a breach of regulatory rules;
Amendment 863 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 2
- empower regulatory authorities to issue orders prohibiting undertakings, which have been identified as “systemic platforms”, from the following practices, inter alia: discrimination in intermediary services; using data to make market entry by third parties more difficult; and engaging in practices aimed at locking in consumers; in response to detailed findings by a regulatory authority, undertakings should be given the possibility to demonstrate that the behaviour in question is justified, yet they should bear the burden of proof for this prior to any order entering into force;
Amendment 865 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 2
- empower regulatory authorities to issue orders prohibiting undertakings, which have been identified as “systemic platforms”, from the following practices, inter alia: discrimination in intermediary services; using data to make market entry by third parties more difficult; and engaging in practices aimed at locking in consumers, yet authorities should bear the burden of proof for this; undertakings should be given the possibility to demonstrate that the behaviour in question is justified;
Amendment 869 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 3
Amendment 874 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 4
- empower regulatory authorities to adopt interim measures and to impose behavioural remedies in the first instance and, if these are not satisfied within the time limit set by the authorities, subsequently proportionate fines on “systemic platforms” that fail to respect the different regulatory obligations imposed on them;
Amendment 881 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6
- impose appropriate levels of interoperability measures requiring “systemic platforms” to share appropriate tools, data, expertise, and resources deployed in order to limit the risks of user and consumer lock-in and of artificially binding users to one systemic platform with no possibility or incentives for switching between digital platforms or internet ecosystems, taking into account the trade-off between interoperability and the potential risks of data sharing for consumers. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of a mechanical interface (Application Programming Interface) that allows users of competing platforms to dock on to the systemic platform and exchange information with it. Related detailed estimations of the financial burden on the EU or national budgets must be included in the impact assessment of the Digital Services Act package.
Amendment 884 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 6
- impose high levels of interoperability measures requiring “systemic platforms” operating in the same markets to share appropriate tools, data, expertise, and resources deployed in order to limit the risks of user and consumer lock-in and of artificially binding users to one systemic platform with no possibility or incentives for switching between digital platforms operating in the same markets or internet ecosystems. As part of those measures, the Commission should explore different technologies and open standards and protocols, including the possibility of a mechanical interface (Application Programming Interface) that allows users of competing platforms to dock on to the systemic platform and exchange information with it.