Activities of Beata MAZUREK related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (92)
Amendment 229 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, and irrespective of the precise nature or subject matter of the law in question.
Amendment 237 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. Furthermore, cloud services that have no active role in the dissemination, monetisation and organisation of the information to the public or end users, at their request, should not be considered as online platforms.
Amendment 246 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. The concept of ‘dissemination to the public’ should not apply to cloud services, including business-to-business cloud services, with respect to which the service provider has no contractual rights concerning what content is stored or how it is processed or made publicly available by its customers or by the end users of such customers, and where the service provider has no technical capability to remove specific content stored by its customers or the end users of its services. Where a service provider offers several services, this Regulation should be applied only in respect of the services that fall within its scope.
__________________
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
Amendment 255 #
Proposal for a regulation
Recital 16
(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union, as well as technological and market developments.
Amendment 266 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities, or the main purpose of which is to engage in or facilitate such activities, does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
Amendment 270 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, such as network management, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
Amendment 304 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, or whose ability originates from regulatory or contractual provisions, so as to prevent and minimise any possible negative effects on the availability and accessibility of information that is not illegal content. Consequently, providers of intermediary services should act on specific illegal content only if they are best placed to do so, and blocking orders should be considered a last-resort measure, applied only when all other options have been exhausted.
Amendment 307 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, although they do not fall within the obligations under this Regulation, can also benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
Amendment 314 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature that would impose constant identification of content across the entirety of available content. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 378 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Nonetheless, the provider should have the possibility to reject a given notice if there is another entity with more granular control over the allegedly illegal content or if the provider has no technical capability to act on the specific content. Therefore, blocking orders should be considered a last-resort measure and applied only when all other options have been exhausted. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 452 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active end users of the service in the Union.
Amendment 454 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The parameters should include, if applicable, the optimisation goal selected by the advertiser, information on the use of custom lists, information on the use of lookalike audiences and, in such a case, relevant information on the seed audience and an explanation of why the recipient of the advertisement has been determined to be part of the lookalike audience, meaningful information about the online platform’s algorithms or other tools used to optimise the delivery of the advertisement, including a specification of the optimisation goal, and a meaningful explanation of the reasons why the online platform has decided that the optimisation goal can be achieved by displaying the advertisement to this recipient.
The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object and automated individual decision-making, including profiling, and specifically the need to obtain the consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 468 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in the number of active end users of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
Amendment 471 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses may have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of active end users exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. In establishing the methodology to calculate the total number of active end users, the Commission should take due account of the different types of platforms and their operations, as well as the potential need for the end user to register, engage in a transaction or interact with content in order to be considered an active end user. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
Amendment 487 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 539 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. In addition, in order to ensure effective protection of the rights of EU citizens that takes into account diverse national laws and differences in socio-cultural context between countries, a Member State should exercise jurisdiction where it concerns online social networking services provided by very large online platforms which offer services to a significant number of recipients in a given Member State.
Member States' jurisdiction is particularly important in the case of very large online platforms which are social networks, because they play a central role in facilitating public debate.
Amendment 567 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, competition, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 649 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) 'active end user' means an individual successfully accessing an online interface and having significant interaction with it, its product or service;
Amendment 689 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any specific information or activity, including the sale of products or the provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 698 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
Amendment 703 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘online social networking service’ means a platform that enables end users to connect, share, discover and communicate with each other across multiple devices and, in particular, via chats, posts, videos and recommendations;
Amendment 710 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means taking an active role in making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
Amendment 748 #
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, or an improvement of the security of that transmission, the service provider shall not be liable for the information transmitted, on condition that the provider:
Amendment 750 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a court or functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 805 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order via a secure communications channel to act against a specific item or multiple items of illegal content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 839 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(ca) the actor receiving the order has the technical and operational ability to act against the specific, notified illegal content and has direct control over it.
Amendment 847 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3a. The Digital Services Coordinator of each Member State, on its own initiative and within 96 hours of receiving a copy of the order to act through the system developed in accordance with paragraph 4a of this Article, shall have the right to scrutinise the order to determine whether it infringes the respective Member State's law and to deem it invalid in its own territory by adopting a reasoned decision.
Amendment 848 #
Proposal for a regulation
Article 8 – paragraph 3 b (new)
3b. Where the Digital Services Coordinator adopts a reasoned decision in accordance with paragraph 3a:
(a) the Digital Services Coordinator shall communicate that decision to the authority that issued the order and to the provider of the service concerned; and
(b) after receiving a decision finding that the content was not in fact illegal, the provider concerned shall immediately reinstate the content, or access thereto, in the territory of the Member State of the Digital Services Coordinator that issued the decision.
Amendment 854 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a. The Commission shall adopt implementing acts establishing a European information exchange system allowing for secure communication and authentication of authorised orders between the relevant authorities, Digital Services Coordinators and providers, as referred to in Articles 8(1), 8a(1) and 9(1). Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
Amendment 856 #
Proposal for a regulation
Article 8 a (new)
Amendment 861 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order via a secure communications channel to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order, without undue delay, of its receipt and of the effect given to the order.
Amendment 901 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact and ensure that information is up to date. Providers of intermediary services shall notify that information, including the name, postal address, the electronic mail address and telephone number of their single point of contact, to the Digital Services Coordinator in the Member State where they are established.
Amendment 916 #
Proposal for a regulation
Article 11 – paragraph 4
4. Providers of intermediary services shall notify identification data, including the name, postal address, electronic mail address and telephone number of their legal representative, to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. The Digital Services Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity.
Amendment 920 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a. Providers of online social networking services designated as very large online platforms in accordance with Article 25 shall designate a legal representative bound by the obligations laid down in this Article at the request of the Digital Services Coordinator of any Member State where the provider offers its services.
Amendment 934 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 954 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers designated as very large online platforms as referred to in Article 25, shall publish their terms and conditions in all official languages of the Union.
Amendment 958 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. The Digital Services Coordinator of each Member State shall have the right to request very large online platforms to apply measures and tools of content moderation, including algorithmic decision-making and human review, reflecting the Member State’s socio-cultural context. The framework for this cooperation, as well as specific measures related thereto, may be laid down in national legislation and shall be notified to the Commission.
Amendment 961 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. The Digital Services Coordinator of each Member State, by means of national legislation, may request a very large online platform to cooperate with the Digital Services Coordinator of the Member State in question in handling cases involving the removal of lawful content online that is taken down erroneously if there is reason to believe that the Member State’s socio-cultural context may have played a vital role.
Amendment 980 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
Amendment 986 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action;
Amendment 993 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
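The averages and medians that points (b) and (d) add to the transparency report are simple aggregates over decision timestamps. A minimal sketch of how a provider might compute them (the record layout and figures are illustrative assumptions, not part of the Regulation):

```python
from datetime import datetime, timedelta
from statistics import mean, median

# Hypothetical records: (notice or complaint received, decision taken).
decisions = [
    (datetime(2021, 5, 1, 9, 0), datetime(2021, 5, 1, 15, 0)),
    (datetime(2021, 5, 2, 9, 0), datetime(2021, 5, 4, 9, 0)),
    (datetime(2021, 5, 3, 9, 0), datetime(2021, 5, 3, 10, 0)),
]

# Time needed for taking the action, in hours, per decision.
hours = [(done - received) / timedelta(hours=1) for received, done in decisions]

average_hours = mean(hours)   # "average time needed for taking the action"
median_hours = median(hours)  # "median time needed for taking the action"
print(average_hours, median_hours)
```

The median is reported alongside the average precisely because a few slow cases can inflate the mean, as in the figures above.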
Amendment 1003 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 1011 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific templates of reports referred to in paragraph 1.
Amendment 1036 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. Notices submitted under the mechanisms referred to in paragraph 1 shall be sufficiently precise and adequately substantiated, on the basis of which a diligent reviewer can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 1044 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary and applicable, additional information enabling the identification of the illegal content, which shall be appropriate to the type of content and to the specific type of intermediary;
Amendment 1056 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned where there is no doubt as to the illegality of the specific item of content. In case of uncertainty and after taking reasonable steps to assess the illegality of the specific item of content, withholding from removal of the content by the provider shall be perceived as acting in good faith and shall not lead to waiving the liability exemption provided for in Article 5.
Amendment 1070 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. Where the provider has no technical, operational or contractual ability to act against specific items of illegal content, it may hand over a notice to the provider that has direct control of specific items of illegal content, while informing the notifying person or entity and the relevant Digital Services Coordinator.
Amendment 1098 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to, or otherwise limit the availability, visibility or accessibility of specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying, removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
Amendment 1138 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises.
Amendment 1164 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) any other decisions that affect the availability, visibility or accessibility of that content, or the recipient's account or access to significant features of the platform's regular services.
Amendment 1186 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. The decision mentioned in this paragraph shall also include: - information on whether the decision referred to in paragraph 1 was taken as a result of human review or through automated means; - in case the decision referred to in paragraph 1 is upheld, a detailed explanation of how the information to which the complaint relates is in breach of the platform’s terms and conditions or why the online platform considers the information to be unlawful.
Amendment 1192 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not taken solely on the basis of automated means. Complainants shall have the right to request human review and consultation with relevant staff of the online platform with respect to the content to which the complaint relates.
Amendment 1197 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5a. Recipients of the service negatively affected by the decision of an online platform shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned. The procedure shall ensure that an independent judicial authority decides on the matter without undue delay, reaching a decision within 14 working days, while granting the negatively affected party the right to seek interim measures to be imposed within 48 hours of the matter being brought before that judicial authority. The rights to seek judicial redress and to obtain interim measures shall not be limited or made subject to the condition of exhausting the internal complaint-handling system.
Amendment 1254 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. Member States shall establish a mechanism enabling the recipients of the service to contest decisions of out-of-court dispute settlement bodies before a national judicial authority or an administrative authority relevant for resolving disputes related to particular content.
Amendment 1261 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by competent trusted flaggers, addressing allegedly illegal content that can seriously affect public security, policy or consumers' health or safety through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
Amendment 1270 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence that could be exercised in one or more Member States for the purposes of detecting, identifying and notifying specific types of illegal content;
Amendment 1294 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. This communication shall include the geographical scope within which the trusted flagger competence was recognised based on the approval of a particular Digital Services Coordinator and information on expertise and competence declared by the trusted flagger.
Amendment 1298 #
Proposal for a regulation
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. Notices referred to in paragraph 1 of this Article shall be processed with priority with respect to the geographical scope of the trusted flagger, according to the awarding of the status by Member States.
Amendment 1300 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
Amendment 1330 #
Proposal for a regulation
Article 20 – paragraph 2
2. Providers of hosting services may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 1475 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every six months, information on the average monthly active end users of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
Amendment 1533 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide for at least four consecutive months their services to a number of average monthly active end users of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
Amendment 1535 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active end users of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active end users of the service in the Union, taking into account different accessibility features.
Amendment 1538 #
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 1
The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active end users of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission.
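The six-monthly verification in this subparagraph reduces to comparing a six-month average of monthly active end users against the 45 million threshold of paragraph 1. A rough sketch (the monthly figures and the simple averaging are illustrative assumptions, not the methodology the delegated acts will lay down):

```python
from statistics import mean

VLOP_THRESHOLD = 45_000_000  # Article 25(1): active end users in the Union

# Hypothetical monthly active end users in the Union, past six months.
monthly_active_end_users = [44.1e6, 44.8e6, 45.2e6, 45.9e6, 46.5e6, 47.0e6]

# Average over the period of the past six months.
average = mean(monthly_active_end_users)

# The designation decision the Digital Services Coordinator of
# establishment would adopt on the basis of that verification.
is_very_large_online_platform = average >= VLOP_THRESHOLD
print(average, is_very_large_online_platform)
```

Note that a platform can cross the threshold on the averaged figure even while individual months fall below it, as the first two months above do.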
Amendment 1539 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
4a. After receiving the decision about the designation as a very large online platform, the online platform may appeal this decision before the Digital Services Coordinator issuing the designation within 60 days. The Digital Services Coordinator may consult the Board. The Digital Services Coordinator shall especially consider the following information while assessing the appeal: (a) the type of content usually shared and the type of the active end user on a given online platform; (b) the exposure to the illegal content as reported under Article 23 and measures taken to mitigate the risks by the online platform; and (c) the exposure to the systemic risks as referred to in Article 26. The Digital Services Coordinator shall decide on the appeal within 60 days. The Digital Services Coordinator may repeatedly initiate this procedure when deemed necessary, after accepting the appeal.
Amendment 1541 #
Proposal for a regulation
Article 25 – paragraph 4 b (new)
4b. The Digital Services Coordinator of establishment may request any online platform to submit a report assessing the dissemination of illegal content through their services, when justified by the information provided in the report submitted in accordance with Article 23. If, after thorough assessment, the Digital Services Coordinator has identified the platform in question as posing significant systemic risks stemming from dissemination of illegal content through their services in the Union, the Digital Services Coordinator may then require proportionate compliance with some or all obligations of Articles 26 to 37.
Amendment 1542 #
Proposal for a regulation
Article 25 – paragraph 4 c (new)
4c. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for the purposes of paragraphs 4a and 4b.
Amendment 1693 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems or any other systems used to determine the order of presentation of content, including those which decrease the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in these systems.
Amendment 1696 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The main parameters referred to in paragraph 1 of this Article shall include, at least the following elements: (a) the main criteria used by the relevant recommender system; (b) how these criteria are prioritised; (c) the optimisation goal of the relevant recommender system; and (d) an explanation of the role that the behaviour of the recipients of the service plays in how the relevant recommender system functions.
Amendment 1699 #
Proposal for a regulation
Article 29 – paragraph 1 b (new)
1b. Very large online platforms shall provide options for the recipients of the service to modify or influence parameters referred to in paragraph 2, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
Amendment 1700 #
Proposal for a regulation
Article 29 – paragraph 2
2. Very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to: (a) select and modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them; (b) select third party recommender systems.
Amendment 1849 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 1855 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1936 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
3a. A Member State shall have jurisdiction for the purposes of Chapters III and IV of this Regulation where providers of online social networking services designated as very large online platforms, as defined in Article 25, are concerned and offer services to a significant number of active end users of the service in that Member State, which may be calculated on the basis of Article 23(2).
Amendment 1967 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. The assessment of the complaint may be supplemented by an opinion of the Digital Services Coordinator of the Member State where the recipient resides or is established on how the matter should be resolved, taking into account the national law and socio-cultural context of the Member State in question. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
Amendment 1970 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Pursuant to paragraph 1 of this Article, the Digital Services Coordinator of establishment, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established, on how the complaint has been handled.
Amendment 1976 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information, including at least information on the name of the issuing authority, the name of the provider and the type of action specified in the order, issued in accordance with Articles 8, 8a and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
Amendment 1981 #
Proposal for a regulation
Article 44 – paragraph 2 a (new)
2a. Based on the information published by Digital Services Coordinators, the Commission shall submit to the European Parliament and to the Council a dedicated biennial report analysing the aggregated data on orders referred to in Articles 8, 8a and 9 and issued by the Digital Services Coordinators, with a special attention being paid to potential abusive use of these Articles. The report shall provide a comprehensive overview of the orders to act against illegal content and it shall provide, for a specific period of time, the possibility to assess the activities of Digital Services Coordinators.
Amendment 1982 #
Proposal for a regulation
Article 44 – paragraph 3 a (new)
3a. The Commission shall adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 4 of Regulation (EU) No 182/2011.
Amendment 1988 #
Proposal for a regulation
Article 45 – paragraph 1 a (new)
1a. A request or recommendation pursuant to paragraph 1 of this Article shall not preclude the Digital Services Coordinator of the Member State where the recipient of the service resides or is established from carrying out its own investigation concerning a suspected infringement of this Regulation by a provider of an intermediary service.
Amendment 1994 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2a. A recommendation pursuant to paragraphs 1 and 2 of this Article may additionally indicate: (a) an opinion on matters that involve taking into account national law and socio-cultural context; and (b) a draft decision based on investigation pursuant to paragraph 1a of this Article.
Amendment 2011 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall be also transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
Amendment 2072 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure and inform the Commission thereof.
Amendment 2086 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) adopt opinions on issues concerning very large online platforms in accordance with this Regulation;
Amendment 2090 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
(ea) issue opinions, recommendations or advice on matters related to Article 34.
Amendment 2141 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
Amendment 2281 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 13, 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
Amendment 2283 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 13, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
Amendment 2286 #
Proposal for a regulation
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 13, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
Amendment 2290 #
Proposal for a regulation
Article 73 – paragraph 1
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation.
Amendment 2291 #
Proposal for a regulation
Article 73 – paragraph 4
Amendment 2295 #
Proposal for a regulation
Article 74 – paragraph 2
2. It shall apply from [date - sixteen months after its entry into force].