Activities of Eugen JURZYCA related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (continuation of debate)
Amendments (63)
Amendment 229 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright-protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, and what the precise nature or subject matter is of the law in question.
Amendment 237 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. Furthermore, cloud services that have no active role in the dissemination, monetisation and organisation of the information to the public or end users, at their request, should not be considered as online platforms.
Amendment 246 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. The concept of ‘dissemination to the public’ should not apply to cloud services, including business-to-business cloud services, with respect to which the service provider has no contractual rights concerning what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by its customers or the end-users of its services. Where a service provider offers several services, this Regulation should be applied only in respect of the services that fall within its scope.
__________________
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
Amendment 255 #
Proposal for a regulation
Recital 16
(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to the case law of the Court of Justice of the European Union, as well as technological and market developments.
Amendment 304 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, or whose ability to act originates from regulatory or contractual provisions, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Consequently, providers of intermediary services should act on specific illegal content only if they are best placed to do so, and blocking orders should be considered a last resort measure, applied only when all other options are exhausted.
Amendment 307 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduit’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, although they do not fall within the scope of the obligations under this Regulation, can also benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
Amendment 314 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to monitoring obligations of a general nature that would impose constant identification of content from the entirety of available content. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 378 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Nonetheless, the provider should have the possibility to reject a given notice if there is another entity with more granular control over the alleged content or if the provider has no technical capability to act on the specific content. Therefore, blocking orders should be considered a last resort measure, applied only when all other options are exhausted. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 452 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active end users of the service in the Union.
Amendment 468 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in the number of active end users of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
Amendment 471 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses may have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of active end users exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. In establishing the methodology to calculate the total number of active end users, the Commission should take due account of the different types of platforms and their operations, as well as of the potential need for the end user to register, engage in a transaction or engage with content in order to be considered an active end user. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
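By way of a purely illustrative, non-normative sketch, the threshold arithmetic described in this recital can be expressed as follows; the round population figure and all names are assumptions of the example, not part of the Regulation.

```python
# Illustrative sketch of the Recital 54 test: significant reach exists
# where active end users exceed an operational threshold of 45 million,
# a number equivalent to 10% of the Union population.
EU_POPULATION = 450_000_000           # assumed round figure for the example
VLOP_THRESHOLD = EU_POPULATION // 10  # 45 million

def exceeds_significant_reach(active_end_users: int) -> bool:
    """Return True where the number of active end users exceeds
    the operational threshold (10% of the Union population)."""
    return active_end_users > VLOP_THRESHOLD
```

The threshold is derived from the population figure rather than hard-coded, mirroring the recital's instruction to keep the operational threshold up to date as the population changes.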
Amendment 567 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, competition, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 649 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) 'active end user' means an individual successfully accessing an online interface and having significant interaction with it, its product or service;
Amendment 689 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any specific information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
Amendment 698 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
Amendment 710 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means taking an active role in making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
Amendment 748 #
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, or an improvement of the security of that transmission, the service provider shall not be liable for the information transmitted, on condition that the provider:
Amendment 805 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order via a secure communications channel to act against one or more specific items of illegal content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 839 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(ca) the actor receiving the order has the technical and operational ability to act against the specific, notified illegal content and has direct control over it.
Amendment 847 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3a. The Digital Services Coordinator of each Member State, on its own initiative and within 96 hours of receiving a copy of the order to act through the system developed in accordance with paragraph 4a of this Article, shall have the right to scrutinise the order to determine whether it infringes the respective Member State's law and deem it invalid on its own territory by adopting a reasoned decision.
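As an illustrative, non-normative sketch only, the 96-hour scrutiny window in this paragraph can be expressed as a simple deadline check; the function and variable names are assumptions of the example.

```python
from datetime import datetime, timedelta

# Illustrative sketch of the paragraph 3a window: the Digital Services
# Coordinator may scrutinise an order within 96 hours of receiving a
# copy of it through the information exchange system.
SCRUTINY_WINDOW = timedelta(hours=96)

def may_still_scrutinise(received_at: datetime, now: datetime) -> bool:
    """Return True while the 96-hour window from receipt is still open."""
    return now <= received_at + SCRUTINY_WINDOW
```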
Amendment 848 #
Proposal for a regulation
Article 8 – paragraph 3 b (new)
3b. Where the Digital Services Coordinator adopts a reasoned decision in accordance with paragraph 3a, (a) the Digital Services Coordinator shall communicate that decision to the authority that issued the order and to the concerned provider of the service; and (b) after receiving a decision finding that the content was not in fact illegal, the concerned provider shall immediately reinstate the content or access thereto in the territory of the Member State of the Digital Services Coordinator that issued the decision.
Amendment 854 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a. The Commission shall adopt implementing acts, organising a European information exchange system, allowing for secure communication and authentication of authorised orders between relevant authorities, Digital Services Coordinators and providers, as referred to in Articles 8(1), 8a(1) and 9(1). Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
Amendment 856 #
Proposal for a regulation
Article 8 a (new)
Amendment 861 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order via a secure communications channel to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order, without undue delay, of its receipt and of the effect given to the order.
Amendment 916 #
Proposal for a regulation
Article 11 – paragraph 4
4. Providers of intermediary services shall notify identification data, including the name, postal address, electronic mail address and telephone number of their legal representative, to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. The Digital Services Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity.
Amendment 934 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 980 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
Amendment 986 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action;
Amendment 993 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
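Purely as an illustration of the reporting metrics named in points (b) and (d) above (average and median time), and not as part of the Regulation, the computation can be sketched as follows; all names are assumptions of the example.

```python
from statistics import mean, median

# Illustrative sketch of the Article 13(1) transparency metrics:
# the average and median time needed for taking decisions on
# complaints (or actions on notices).
def decision_time_metrics(decision_times_hours: list[float]) -> dict[str, float]:
    """Compute the average and median decision time, in hours."""
    return {
        "average_hours": mean(decision_times_hours),
        "median_hours": median(decision_times_hours),
    }
```

Reporting the median alongside the average matters because a few very slow decisions can inflate the average while leaving the median, the typical case, unchanged.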
Amendment 1003 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 1011 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific templates of reports referred to in paragraph 1.
Amendment 1036 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. Notices submitted under the mechanisms referred to in paragraph 1 shall be sufficiently precise and adequately substantiated, on the basis of which a diligent reviewer can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
Amendment 1044 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary and applicable, additional information enabling the identification of the illegal content, which shall be appropriate to the type of content and to the specific type of intermediary;
Amendment 1056 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned where there is no doubt as to the illegality of the specific item of content. In case of uncertainty, and after taking reasonable steps to assess the illegality of the specific item of content, a provider that withholds removal of the content shall be considered to act in good faith and shall not lose the liability exemption provided for in Article 5.
Amendment 1164 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) any other decisions that affect the availability, visibility or accessibility of that content or the recipient's account, or the recipient's access to significant features of the platform's regular services.
Amendment 1254 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. Member States shall establish a mechanism enabling the recipients of the service to contest decisions of out-of-court dispute settlement bodies before a national judicial authority or an administrative authority relevant for resolving disputes related to particular content.
Amendment 1270 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence that could be exercised in one or more Member States for the purposes of detecting, identifying and notifying specific types of illegal content;
Amendment 1298 #
Proposal for a regulation
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. Notices referred to in paragraph 1 of this Article shall be processed with priority within the geographical scope of the trusted flagger, in accordance with the awarding of that status by Member States.
Amendment 1330 #
Proposal for a regulation
Article 20 – paragraph 2
2. Providers of hosting services may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 1475 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every six months, information on the average monthly active end users of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
Amendment 1533 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide, for at least four consecutive months, their services to a number of average monthly active end users of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
Amendment 1535 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active end users of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and the criteria to determine the average monthly active end users of the service in the Union, taking into account different accessibility features.
Amendment 1538 #
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 1
The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active end users of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission.
Amendment 1539 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
4a. After receiving the decision about the designation as a very large online platform, the online platform may appeal that decision to the Digital Services Coordinator that issued the designation within 60 days. The Digital Services Coordinator may consult the Board. The Digital Services Coordinator shall in particular consider the following information when assessing the appeal: (a) the type of content usually shared and the type of active end user on the given online platform; (b) the exposure to illegal content as reported under Article 23 and the measures taken by the online platform to mitigate the risks; and (c) the exposure to systemic risks as referred to in Article 26. The Digital Services Coordinator shall decide on the appeal within 60 days. After accepting an appeal, the Digital Services Coordinator may initiate this procedure again where deemed necessary.
Amendment 1542 #
Proposal for a regulation
Article 25 – paragraph 4 c (new)
4c. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for the purposes of paragraphs 4a and 4b.
Amendment 1693 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems or any other systems used to determine the order of presentation of content, including those which decrease the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in these systems.
Amendment 1696 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The main parameters referred to in paragraph 1 of this Article shall include, at least the following elements: (a) the main criteria used by the relevant recommender system; (b) how these criteria are prioritised; (c) the optimisation goal of the relevant recommender system; and (d) an explanation of the role that the behaviour of the recipients of the service plays in how the relevant recommender system functions.
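As an illustration only, and not as anything prescribed by the text, a platform's disclosure of the four elements listed in paragraph 1a could be structured as a simple machine-readable record; every field name and value below is an assumption of the example.

```python
# Illustrative sketch mapping the four paragraph 1a elements to a
# hypothetical disclosure record: (a) main criteria, (b) their
# prioritisation, (c) the optimisation goal, and (d) the role of
# recipient behaviour. All names and values are invented for the example.
RECOMMENDER_DISCLOSURE = {
    "main_criteria": ["topical relevance", "recency", "account follows"],
    "criteria_prioritisation": ["topical relevance", "account follows", "recency"],
    "optimisation_goal": "predicted engagement with recommended content",
    "role_of_recipient_behaviour": (
        "interactions such as views and follows feed the relevance score"
    ),
}
```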
Amendment 1849 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 1855 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1970 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Pursuant to paragraph 1 of this Article, the Digital Services Coordinator of establishment shall, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, assess the matter in a timely manner and inform that Digital Services Coordinator of how the complaint has been handled.
Amendment 1976 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information, including at least information on the name of the issuing authority, the name of the provider and the type of action specified in the order, issued in accordance with Articles 8, 8a and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
Amendment 1981 #
Proposal for a regulation
Article 44 – paragraph 2 a (new)
2a. Based on the information published by Digital Services Coordinators, the Commission shall submit to the European Parliament and to the Council a dedicated biennial report analysing the aggregated data on orders referred to in Articles 8, 8a and 9 and issued by the Digital Services Coordinators, with special attention paid to potential abusive use of these Articles. The report shall provide a comprehensive overview of the orders to act against illegal content and shall make it possible, for a specific period of time, to assess the activities of Digital Services Coordinators.
Amendment 1982 #
Proposal for a regulation
Article 44 – paragraph 3 a (new)
3a. The Commission shall adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 4 of Regulation (EU) No 182/2011.
Amendment 1988 #
Proposal for a regulation
Article 45 – paragraph 1 a (new)
1a. A request or recommendation pursuant to paragraph 1 of this Article shall not preclude the Digital Services Coordinator of the Member State where the recipient of the service resides or is established from carrying out its own investigation concerning a suspected infringement of this Regulation by a provider of an intermediary service.
Amendment 1994 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2a. A recommendation pursuant to paragraphs 1 and 2 of this Article may additionally indicate:
(a) an opinion on matters that involve taking into account national law and socio-cultural context; and
(b) a draft decision based on the investigation pursuant to paragraph 1a of this Article.
Amendment 2011 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
Amendment 2072 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure and inform the Commission thereof.
Amendment 2086 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) adopt opinions on issues concerning very large online platforms in accordance with this Regulation;
Amendment 2090 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
(ea) issue opinions, recommendations or advice on matters related to Article 34.
Amendment 2281 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 13, 23, 25 and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
Amendment 2283 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 13, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
Amendment 2286 #
Proposal for a regulation
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 13, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
Amendment 2295 #
Proposal for a regulation
Article 74 – paragraph 2
2. It shall apply from [date - sixteen months after its entry into force].