24 Amendments of Martin SCHIRDEWAN related to 2020/2018(INL)
Amendment 12 #
Motion for a resolution
Citation 7 a (new)
- having regard to the study by Dr Melanie Smith, "Enforcement and cooperation between Member States in a Digital Services Act", commissioned by the European Parliament’s Committee on the Internal Market and Consumer Protection, Luxembourg, 2020,
Amendment 17 #
Motion for a resolution
Citation 7 b (new)
- having regard to the opinion of the Committee of the Regions (ECON-VI/048) of 5 December 2019 on “A European framework for regulatory responses to the collaborative economy”,
Amendment 37 #
Motion for a resolution
Recital C a (new)
Ca. whereas a small number of companies have developed market dominance by acquiring an unprecedented level of knowledge about people’s lives;
Amendment 135 #
Motion for a resolution
Paragraph 6
6. Considers that the Digital Services Act should be based on a European approach going beyond the economic sphere, protecting all fundamental rights, including non-discrimination, privacy, dignity and fairness, as well as free speech and the rule of law, and should aim to foster the creation of a rich and diverse online ecosystem with a wide range of online services, a favourable digital environment and legal certainty to unlock the full potential of the Digital Single Market;
Amendment 143 #
Motion for a resolution
Paragraph 6 a (new)
(6a) Considers that, while the horizontal approach of the E-Commerce Directive should be maintained, a “one-size-fits-all” approach is not suitable to address all the new challenges in today’s digital landscape; stresses, therefore, that the diversity of actors and services offered online needs a tailored regulatory approach;
Amendment 192 #
Motion for a resolution
Paragraph 10
10. Stresses that the Digital Services Act should ensure that the fundamental rights and principles set out in the Charter of Fundamental Rights of the European Union are reflected in the internal market freedoms;
Amendment 254 #
Motion for a resolution
Paragraph 16
16. Stresses that existing obligations, set out in the E-Commerce Directive and Directive 2005/29/EC of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’)3, on transparency of commercial communications and digital advertising should be strengthened; points out that pressing consumer protection concerns about profiling, targeting and personalised pricing cannot be addressed by transparency obligations and left to consumer choice alone; considers that practices such as profiling deeply interfere with people’s rights and freedoms; recognises that the General Data Protection Regulation framework does not adequately protect consumers against profile building and unjustified automated decisions; is of the opinion that, in order to ensure adequate protection of consumers, personal data should only be used where it is necessary to provide the service requested;
__________________
3 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (OJ L 149, 11.6.2005, p. 22).
Amendment 278 #
Motion for a resolution
Paragraph 17 a (new)
(17a) Considers it necessary to end the “attention-seeking” profiling business model of digital markets, in which algorithms prioritise controversial content and thus contribute to its spread online; stresses, therefore, that users should have more control over how rankings are presented, e.g. by giving them the choice to arrange rankings in alternative ways;
Amendment 284 #
Motion for a resolution
Paragraph 18
18. Considers that consumers should be properly informed, in an understandable and easily accessible way, and that their rights should be effectively guaranteed when they interact with automated decision-making systems and other innovative digital services or applications; believes that it should be possible for consumers to request checks and corrections of possible mistakes resulting from automated decisions, as well as human intervention, and that consumers should have the right to seek redress for any damage related to the use of automated decision-making systems; considers that the set of rights of consumers should be expanded to better protect them in the digital world, in particular the right to accountability and control and the right to fairness, which should be considered in order to foster the necessary trust of consumers in AI applications;
Amendment 331 #
Motion for a resolution
Paragraph 21
21. Considers that voluntary actions and self-regulation by online platforms across Europe have brought some benefits, but that additional measures are needed in order to ensure the swift detection and removal of illegal content online; considers that, instead of applying algorithms for automated filtering technologies, a solid reform of the “notice and take down” framework should be introduced; stresses that, where filters are used, they need to be accompanied by robust safeguards for transparency and accountability, with highly skilled, independent and impartial public oversight; rejects, therefore, “good Samaritan” clauses for dominant platforms;
Amendment 401 #
Motion for a resolution
Paragraph 25
25. Stresses that it is unacceptable that Union consumers are exposed to illegal and unsafe products that contain dangerous chemicals or pose other safety hazards; stresses that, with regard to commercial activities on online marketplaces, self-regulation has proved to be insufficient and therefore asks the Commission to introduce strong safeguards and obligations for product safety and consumer protection for commercial activities on online marketplaces, accompanied by a tailored liability regime with proper enforcement mechanisms;
Amendment 433 #
Motion for a resolution
Subheading 7
Ex ante regulation of dominant platforms
Amendment 434 #
Motion for a resolution
Paragraph 27
27. Notes that, today, large platforms have acquired a huge amount of data and have replaced the services of a diverse and decentralised system with open standards by “walled gardens” with locked-in users; stresses that, as a consequence, some markets are characterised by large platforms with significant network effects which are able to act as de facto “online gatekeepers” of the digital economy; considers it therefore necessary to introduce additional obligations regarding data protection, transparency, user choice and interoperability in order to guarantee a level playing field and consumer welfare;
Amendment 457 #
Motion for a resolution
Paragraph 28 a (new)
(28a) Considers that increased transparency from platforms on data sharing is crucial to guaranteeing the functioning of ex ante regulation; notes that self-reporting without the ability to audit is not sufficient and stresses, therefore, that authorities should have the power to compel data from dominant platforms and need to be equipped with the staff and resources to properly interpret that data;
Amendment 466 #
Motion for a resolution
Paragraph 29
29. Believes that, in view of the cross-border nature of digital services, effective supervision and cooperation between Member States is key to ensuring the proper enforcement of the Digital Services Act; stresses, therefore, that it is not only necessary for the competent authorities of the country of destination to receive all the data required for public administrations to fulfil their law enforcement tasks, but also considers it necessary to enlarge the derogations from Article 3 in the Annex by adding provisions related to tax and housing policies;
Amendment 515 #
Motion for a resolution
Annex I – part I – paragraph 3
The Digital Services Act should provide consumers and economic operators, especially micro, small and medium-sized enterprises, with legal certainty and transparency and should not apply a "one-size-fits-all" approach;
Amendment 521 #
Motion for a resolution
Annex I – part I – paragraph 4
The Digital Services Act should respect the broad framework of fundamental European rights of users and consumers, such as the protection of privacy, data, non-discrimination, dignity, fairness and free speech;
Amendment 559 #
Motion for a resolution
Annex I – part II – paragraph 3
The Digital Services Act should maintain, and consider extending, the derogation set out in the Annex to the E-Commerce Directive and, in particular, the derogation for contractual obligations concerning consumer contracts;
Amendment 596 #
Motion for a resolution
Annex I – part III – paragraph 1 – indent 5
- define “dominant operator” by establishing a set of clear economic indicators that allow regulatory authorities to identify platforms with a “gatekeeper” role that play a systemic role in the online economy; such indicators could include considerations such as whether the undertaking is active to a significant extent on multi-sided markets, the size of its network (number of users), its financial strength, access to data, vertical integration, the importance of its activity for third parties’ access to supply and markets, etc.
Amendment 671 #
Motion for a resolution
Annex I – part IV – paragraph 1 – subheading 4 – indent 3 a (new)
- establish more transparency regarding ranking results and end the “attention-seeking” profiling business model of digital markets, in order to reduce the spread of controversial content and to give users more choice over how rankings are presented;
Amendment 706 #
Motion for a resolution
Annex I – part V – paragraph 1 – indent 3
- preserve the underlying legal principle that online intermediaries should not be held directly liable for the acts of their users and that online intermediaries can continue moderating legal content under fair and transparent terms and conditions of service, provided that they are applicable in a non-discriminatory manner; where filters are applied, they need to be accompanied by robust safeguards for transparency and accountability, with highly skilled, independent and impartial public oversight.
Amendment 769 #
Motion for a resolution
Annex I – part V – subheading 2 – indent 4
- If the redress and counter-notice procedure has established that the notified activity or information is not illegal, the online intermediary should restore the removed content without undue delay or allow re-upload by the user, without prejudice to the platform's terms of service.
Amendment 790 #
Motion for a resolution
Annex I – part V – paragraph 4
The Digital Services Act should protect, uphold and adapt the current limited exemptions from secondary liability for information society service providers (online intermediaries) provided for in Articles 12, 13 and 14 of the current E-Commerce Directive to new challenges in the digital landscape. Therefore, the Digital Services Act should introduce a tailored liability regime with proper enforcement mechanisms for commercial activities on online marketplaces in order to guarantee consumer protection and product safety.
Amendment 877 #
Motion for a resolution
Annex I – part VII – paragraph 2 – indent 5
- ensure that the rights, obligations and principles of the GDPR, including data minimisation, purpose limitation, data protection by design and by default and legal grounds for processing, are observed and that shortcomings with regard to profile building are addressed in order to ensure adequate protection of consumers;