Activities of Martina MICHELS related to 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/10/05
Committee: CULT
Dossiers: 2020/0361(COD)
Documents: PDF(421 KB) DOC(265 KB)
Authors: Sabine VERHEYEN (MEP ID 96756)

Amendments (17)

Amendment 115 #
Proposal for a regulation
Recital 1
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users, including institutions of public life, and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for institutional users such as educational and cultural institutions, and for society as a whole. _________________ 25Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2021/07/23
Committee: CULT
Amendment 143 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly but must be determined separately from legal harmful content such as equivalent relevant information on 'illegal content'; it also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/23
Committee: CULT
Amendment 151 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, the dissemination and meaning of that information, in the context of automatic, quantity-statistical or broader computer-linguistic document analysis, and which it actively employs. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
2021/07/23
Committee: CULT
Amendment 164 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online cannot be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question, and yet complete faith is in some cases placed in non-transparent programmed routines being used for automatic content recognition. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
2021/07/23
Committee: CULT
Amendment 174 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the freedom of information and data security and trust of the recipients of the service, including minors and vulnerable users, and the relevant fundamental rights to freedom of expression and protection against discrimination enshrined in the Charter, and to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/23
Committee: CULT
Amendment 204 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should be conferred primarily on organisations, industry representatives and public entities that have particular expertise and competence in tackling illegal content, as they represent the corresponding collective and public interests. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, right holders, organisations of industry and collecting societies could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council43. _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, (OJ L 135, 24.5.2016, p. 53).
2021/07/23
Committee: CULT
Amendment 212 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in all the Member States in which their intermediary services are used.
2021/07/23
Committee: CULT
Amendment 214 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach - as a result in particular of the number of active recipients of the service - in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose due diligence obligations concerning the guarantee of rights of information and fundamental rights and the transparency of free, non-discriminatory communication and business performance for business customers on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/23
Committee: CULT
Amendment 221 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected and guaranteed by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the rights of the child and the protection of women from doxing and gender-specific cybermobbing, which in many Member States are still not even classified as specific criminal offences. Such risks may, for example, be incorporated in the basic programming of the algorithmic systems used by the very large online platform or arise from the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition and the freedom to exercise a profession. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/23
Committee: CULT
Amendment 225 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. This will involve ensuring digital care for those carrying out the internal removal of illegal content, in particular the deletion of previously-identified scenes of violence in illegal content, by means of psychological support and the imposition of an appropriate time limit for these tasks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/23
Committee: CULT
Amendment 255 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/23
Committee: CULT
Amendment 276 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. No provision of this Regulation should be understood as prescribing, promoting or recommending the use of automated decision-making or the monitoring of the behaviour of a large number of natural persons - not even for statistical purposes.
2021/07/23
Committee: CULT
Amendment 370 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is made easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/23
Committee: CULT
Amendment 377 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform or state law enforcement authorities;
2021/07/23
Committee: CULT
Amendment 403 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that display advertising on their online interfaces may use profiling only by agreement, not as a default setting. They must ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time:
2021/07/23
Committee: CULT
Amendment 477 #
Proposal for a regulation
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation. This will require structured cooperation with the Digital Services Coordinators of the Member States and the Board identified in Chapter VI, in order to implement, EU-wide, the due diligence obligations of large platforms set out in Chapter III.
2021/07/23
Committee: CULT
Amendment 479 #
Proposal for a regulation
Article 46 – paragraph 2
2. Where a Digital Services Coordinator of establishment or the Digital Services Coordinators of at least three Member States have reasons to suspect that a very large online platform infringed this Regulation, it may request the Commission to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting the Commission to intervene.
2021/07/23
Committee: CULT