
217 Amendments of Kosma ZŁOTOWSKI related to 2020/0361(COD)

Amendment 16 #
Proposal for a regulation
Recital 6
(6) In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport of persons and goods, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union.
2021/06/01
Committee: TRAN
Amendment 20 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also cover information relating to illegal content, products, services and activities, based on the general idea that what is illegal offline should also be illegal online. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, offering services that require a licence or approval from a competent national authority without having the appropriate credentials, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/06/01
Committee: TRAN
Amendment 46 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade and transport and tourism services. The way they design their services is generally optimised to benefit their often advertising- driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/06/01
Committee: TRAN
Amendment 76 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are not connected to his or her trade, business, craft or profession;
2021/06/01
Committee: TRAN
Amendment 77 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service, unless this activity is an ancillary and additional feature of another service which is not an information society service and cannot, for objective or technical reasons, be provided independently of it;
2021/06/01
Committee: TRAN
Amendment 119 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/20
Committee: JURI
Amendment 124 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact, including their postal address, and shall ensure that this information is kept up to date. Intermediary service providers shall provide the Digital Services Coordinator in the Member State where they are established with their contact details, including the name, postal address, e-mail address and telephone number of their single point of contact.
2021/06/01
Committee: TRAN
Amendment 127 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. Furthermore, cloud services that have no active role in the dissemination, monetisation and organisation of the information to the public or end users, at their request, should not be considered as online platforms.
2021/07/20
Committee: JURI
Amendment 131 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, unambiguous, straightforward and understandable language and shall be publicly available in an easily accessible format.
2021/06/01
Committee: TRAN
Amendment 134 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. The concept of ‘dissemination to the public’ should not apply to cloud services, including business-to-business cloud services, with respect to which the service provider has no contractual rights concerning what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by their customers or the end-users of their services. Where a service provider offers several services, this Regulation should be applied only in respect of the services that fall within its scope.
_________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/20
Committee: JURI
Amendment 139 #
Proposal for a regulation
Recital 16
(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union, as well as technological and market developments.
2021/07/20
Committee: JURI
Amendment 146 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(c a) any other decision that affects the availability, visibility or accessibility of the content and/or of the recipient's account or the recipient's access to relevant platform services and features.
2021/06/01
Committee: TRAN
Amendment 146 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities, or the main purpose of which is to engage in or facilitate such activities, does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
2021/07/20
Committee: JURI
Amendment 147 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, such as network management, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
2021/07/20
Committee: JURI
Amendment 161 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, or whose ability to act originates from regulatory or contractual provisions, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Consequently, providers of intermediary services should act on specific illegal content only if they are best placed to do so, and blocking orders should be considered a last resort measure, applied only when all other options are exhausted.
2021/07/20
Committee: JURI
Amendment 163 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, although they do not fall within the obligations under this Regulation, can also benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
2021/07/20
Committee: JURI
Amendment 167 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation of a general nature, imposing constant content identification across the entirety of available content. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
2021/07/20
Committee: JURI
Amendment 198 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Nonetheless, the provider should have the possibility to reject a given notice if there is another entity with more granular control over the alleged content or the provider has no technical capability to act on specific content. Therefore, blocking orders should be considered a last resort measure and applied only when all other options are exhausted. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/20
Committee: JURI
Amendment 209 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems or other systems to prioritise content, including reducing the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/06/01
Committee: TRAN
Amendment 229 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 237 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. Furthermore, cloud services that have no active role in the dissemination, monetisation and organisation of the information to the public or end users, at their request, should not be considered as online platforms.
2021/07/08
Committee: IMCO
Amendment 243 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active end users of the service in the Union.
2021/07/20
Committee: JURI
Amendment 244 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The parameters shall include, if applicable, the optimisation goal selected by the advertiser, information on the use of custom lists, information on the use of lookalike audiences and in such case – relevant information on the seed audience and an explanation why the recipient of the advertisement has been determined to be part of the lookalike audience, meaningful information about the online platform’s algorithms or other tools used to optimise the delivery of the advertisement, including a specification of the optimisation goal and a meaningful explanation of reasons why the online platform has decided that the optimisation goal can be achieved by displaying the advertisement to this recipient. 
The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/20
Committee: JURI
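The recital above enumerates the items of per-advertisement information a platform would have to surface. Purely as an illustrative sketch (the Regulation prescribes no data format, and every field name below is hypothetical, not normative), that disclosure could be modelled as a simple record:

```python
# Hypothetical sketch of the per-advertisement disclosure described in
# Recital 52 as amended. The Regulation prescribes no data format;
# all field and class names here are illustrative, not normative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdDisclosure:
    advertiser: str                            # on whose behalf the ad is displayed
    optimisation_goal: Optional[str]           # goal selected by the advertiser, if applicable
    uses_custom_list: bool                     # whether a custom audience list was used
    uses_lookalike_audience: bool              # whether a lookalike audience was used
    seed_audience_info: Optional[str]          # seed audience details, if a lookalike was used
    lookalike_inclusion_reason: Optional[str]  # why this recipient was matched
    delivery_logic_explanation: str            # meaningful explanation of the delivery logic

disclosure = AdDisclosure(
    advertiser="Example Advertiser",
    optimisation_goal="link clicks",
    uses_custom_list=False,
    uses_lookalike_audience=True,
    seed_audience_info="existing customers of the advertiser (illustrative)",
    lookalike_inclusion_reason="similar declared interests (illustrative)",
    delivery_logic_explanation="predicted relevance to the recipient (illustrative)",
)
```

The point of the sketch is only that each element the recital lists maps to a concrete field a recipient could be shown; the conditional fields are optional because the recital qualifies them with "if applicable".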
Amendment 246 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. The concept of ‘dissemination to the public’ should not apply to cloud services, including business-to-business cloud services, with respect to which the service provider has no contractual rights concerning what content is stored or how it is processed or made publicly available by its customers or by the end-users of such customers, and where the service provider has no technical capability to remove specific content stored by their customers or the end-users of their services. Where a service provider offers several services, this Regulation should be applied only in respect of the services that fall within its scope.
__________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/08
Committee: IMCO
Amendment 250 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of active end users of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/20
Committee: JURI
Amendment 253 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses may have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of active end users exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. In the process of establishing the methodology to calculate the total number of active end users, the Commission should take due account of the different type of platforms and their operations, as well as the potential need for the end user to register, engage in a transaction or engage with content in order to be considered as an active end user. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
2021/07/20
Committee: JURI
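The threshold arithmetic in the recital above is simple enough to state exactly. The following is a minimal sketch only: the Regulation defines no code, the counting methodology for "active end users" is left to a Commission delegated act, and the population figure and function names below are illustrative assumptions.

```python
# Illustrative sketch of the Recital 54 reach threshold.
# The methodology for counting active end users is left to a
# delegated act; names and the population figure are assumptions.

UNION_POPULATION = 450_000_000  # approximate EU population, for illustration
THRESHOLD_SHARE = 0.10          # 10 % of the Union population
THRESHOLD = 45_000_000          # operational threshold set in the Regulation

def is_very_large_online_platform(avg_monthly_active_end_users: int) -> bool:
    """True where the platform's reach meets the 45 million threshold."""
    return avg_monthly_active_end_users >= THRESHOLD
```

Note that the 45 million figure is fixed in the text while the 10% share is its rationale; the recital anticipates keeping the operational threshold aligned with the population via delegated acts.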
Amendment 255 #
Proposal for a regulation
Recital 16
(16) The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale-up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union, as well as technological and market developments.
2021/07/08
Committee: IMCO
Amendment 266 #
Proposal for a regulation
Article 54 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the competent authorities of the Member State may, at the request of the Commission, conduct on-site inspections within the meaning of Article 12 of Regulation (EU) No 139/2004 at the premises of the very large online platform concerned or other person referred to in Article 52(1).
2021/06/01
Committee: TRAN
Amendment 266 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities, or the main purpose of which is to engage in or facilitate such activities, does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
2021/07/08
Committee: IMCO
Amendment 266 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 267 #
Proposal for a regulation
Article 54 – paragraph 3
3. During on-site inspections the competent authorities of the Member States may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1).
2021/06/01
Committee: TRAN
Amendment 268 #
Proposal for a regulation
Article 54 – paragraph 4
4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered at the request of the Commission. The request shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union.
2021/06/01
Committee: TRAN
Amendment 270 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 10 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
2021/06/01
Committee: TRAN
Amendment 270 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature, such as network management, which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
2021/07/08
Committee: IMCO
Amendment 304 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, or whose ability to act originates from regulatory or contractual provisions, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. Consequently, providers of intermediary services should act on the specific illegal content only if they are in the best place to do so, and blocking orders should be considered a last-resort measure, applied only when all other options are exhausted.
2021/07/08
Committee: IMCO
Amendment 307 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, although they do not fall within the obligations under this Regulation, can also benefit from the exemptions from liability, to the extent that they qualify as a ‘mere conduit’, ‘caching’ or hosting service.
2021/07/08
Committee: IMCO
Amendment 313 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. In addition, in order to ensure effective protection of the rights of Union citizens that takes into account diverse national laws and differences in socio-cultural context between countries, a Member State should exercise jurisdiction where it concerns online social networking services provided by very large online platforms which offer services to a significant number of recipients in a given Member State.
Member States' jurisdiction is particularly important in the case of very large online platforms which are social networks, because they play a central role in facilitating public debate.
2021/07/19
Committee: JURI
Amendment 314 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, imposing constant content identification across the entirety of available content. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities issued in accordance with national legislation and with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
2021/07/08
Committee: IMCO
Amendment 327 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, competition, electronic communications, audiovisual services, detection and investigation of fraud against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/19
Committee: JURI
Amendment 369 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(b a) 'active end user' means an individual successfully accessing an online interface and having significant interaction with it or with its product or service;
2021/07/19
Committee: JURI
Amendment 370 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;
2021/07/19
Committee: JURI
Amendment 377 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service, unless this activity is an ancillary and additional feature of another service which is not an information society service and cannot, for objective or technical reasons, be provided independently of it;
2021/07/19
Committee: JURI
Amendment 378 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Nonetheless, the provider should have the possibility to reject a given notice if there is another entity with more granular control over the allegedly illegal content or the provider has no technical capability to act on the specific content. Therefore, blocking orders should be considered a last-resort measure, applied only when all other options are exhausted. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/08
Committee: IMCO
Amendment 386 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any specific information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/19
Committee: JURI
Amendment 387 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
2021/07/19
Committee: JURI
Amendment 390 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘online social networking service’ means a platform that enables end users to connect, share, discover and communicate with each other across multiple devices and, in particular, via chats, posts, videos and recommendations;
2021/07/19
Committee: JURI
Amendment 393 #
Proposal for a regulation
Article 2 – paragraph 1 – point h b (new)
(h b) ‘editorial platform’ means an intermediary service which is in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790 or another editorial media service and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content, and which is under the supervision of the editorial team of the publication or other editorial media;
2021/07/19
Committee: JURI
Amendment 394 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means taking an active role in making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
2021/07/19
Committee: JURI
Amendment 413 #
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, or an improvement of the security of that transmission, the service provider shall not be liable for the information transmitted, on condition that the provider:
2021/07/19
Committee: JURI
Amendment 415 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a court or functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
2021/07/19
Committee: JURI
Amendment 446 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order via a secure communications channel to act against a specific item or multiple items of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
2021/07/19
Committee: JURI
Amendment 452 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active end users of the service in the Union.
2021/07/08
Committee: IMCO
Amendment 454 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The parameters should include, if applicable, the optimisation goal selected by the advertiser, information on the use of custom lists, information on the use of lookalike audiences and, in such a case, relevant information on the seed audience and an explanation why the recipient of the advertisement has been determined to be part of the lookalike audience, meaningful information about the online platform’s algorithms or other tools used to optimise the delivery of the advertisement, including a specification of the optimisation goal and a meaningful explanation of reasons why the online platform has decided that the optimisation goal can be achieved by displaying the advertisement to this recipient.
The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/08
Committee: IMCO
Amendment 466 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(c a) the actor receiving the order has the technical and operational ability to act against the specific, notified illegal content and has direct control over it.
2021/07/19
Committee: JURI
Amendment 468 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of active end users of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/08
Committee: IMCO
Amendment 471 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses may have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of active end users exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. In the process of establishing the methodology to calculate the total number of active end users, the Commission should take due account of the different types of platforms and their operations, as well as the potential need for the end user to register, engage in a transaction or interact with content in order to be considered as an active end user. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
2021/07/08
Committee: IMCO
Amendment 475 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3 a. The Digital Services Coordinator of each Member State, on its own initiative and within 96 hours of receiving a copy of the order to act through the system developed in accordance with paragraph 4a of this Article, shall have the right to scrutinise the order to determine whether it infringes the respective Member State's law and deem it invalid on its own territory by adopting a reasoned decision.
2021/07/19
Committee: JURI
Amendment 477 #
Proposal for a regulation
Article 8 – paragraph 3 b (new)
3 b. Where the Digital Services Coordinator adopts a reasoned decision in accordance with paragraph 3a: (a) the Digital Services Coordinator shall communicate that decision to the authority that issued the order and to the provider of the service concerned; and (b) after receiving a decision finding that the content was not in fact illegal, the provider concerned shall immediately reinstate the content or access thereto in the territory of the Member State of the Digital Services Coordinator that issued the decision.
2021/07/19
Committee: JURI
Amendment 480 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4 a. The Commission shall adopt implementing acts, organising a European information exchange system, allowing for secure communication and authentication of authorised orders between relevant authorities, Digital Services Coordinators and providers, as referred to in Articles 8(1), 8a(1) and 9(1). Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
2021/07/19
Committee: JURI
Amendment 484 #
Proposal for a regulation
Article 8 a (new)
Article 8 a
Orders to restore lawful content
1. Providers of intermediary services shall, upon the receipt of an order via a secure communications channel to restore a specific item or multiple items of removed content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order without undue delay, specifying the action taken and the moment when the action was taken.
2. Member States shall ensure that the orders referred to in paragraph 1 meet the following conditions:
(a) the orders contain the following elements:
(i) a statement of reasons explaining why the content in question is legal, by reference to the specific provision of Union or national law or court ruling;
(ii) one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the legal content concerned;
(iii) information about redress available to the provider of the service who removed the content and to the recipient of the service who notified the content;
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; and
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by the provider in accordance with Article 10.
2021/07/19
Committee: JURI
Amendment 487 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 487 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order via a secure communications channel to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/07/19
Committee: JURI
Amendment 522 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact and ensure that information is up to date. Providers of intermediary services shall notify that information, including the name, postal address, the electronic mail address and telephone number of their single point of contact, to the Digital Services Coordinator in the Member State where they are established.
2021/07/19
Committee: JURI
Amendment 524 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in at least one of the Member States where the provider offers its services. Very large online platforms shall designate a legal representative in each of the Member States where the provider offers its services.
2021/07/19
Committee: JURI
Amendment 529 #
Proposal for a regulation
Article 11 – paragraph 4
4. Providers of intermediary services shall notify identification data, including the name, postal address, the electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. The Digital Service Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity.
2021/07/19
Committee: JURI
Amendment 530 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5 a. Providers of online social networking services designated as very large online platforms according to Article 25 shall designate a legal representative bound by the obligations laid down in this Article at the request of the Digital Services Coordinator of a Member State where the provider offers its services.
2021/07/19
Committee: JURI
Amendment 532 #
Proposal for a regulation
Article 11 a (new)
Article 11 a
Exclusions
Articles 12 and 13 of Section 1, and the provisions of Sections 2 and 3, of Chapter III shall not apply to:
(a) editorial platforms within the meaning of Article 2(h a) of this Regulation;
(b) online platforms that qualify as micro and medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC;
(c) an intermediary service, except very large online platforms, where compliance would constitute a disproportionate burden in view of its size, the nature of its activity and the risk posed to users.
2021/07/19
Committee: JURI
Amendment 538 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/19
Committee: JURI
Amendment 539 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. In addition, in order to ensure effective protection of the rights of EU citizens that takes into account diverse national laws and differences in socio-cultural context between countries, a Member State should exercise jurisdiction where it concerns online social networking services provided by very large online platforms which offer services to a significant number of recipients in a given Member State.
Member States’ jurisdiction is particularly important in the case of very large online platforms which are social networks because they play a central role in facilitating public debate.
2021/07/08
Committee: IMCO
Amendment 549 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers designated as very large online platforms as referred to in Article 25, shall publish their terms and conditions in all official languages of the Union.
2021/07/19
Committee: JURI
Amendment 550 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. The Digital Services Coordinator of each Member State has the right to request very large online platforms, to apply measures and tools of content moderation, including algorithmic decision-making and human review reflecting Member State’s socio-cultural context. The framework for this cooperation as well as specific measures related thereto may be laid down in national legislation and shall be notified to the Commission.
2021/07/19
Committee: JURI
Amendment 553 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. The Digital Services Coordinator of each Member State, by means of national legislation, may request a very large online platform to cooperate with the Digital Services Coordinator of the Member State in question in handling cases involving the removal of lawful content online that is taken down erroneously if there is reason to believe that the Member State’s socio-cultural context may have played a vital role.
2021/07/19
Committee: JURI
Amendment 557 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2d. Where very large online platforms within the meaning of Article 25 of this Regulation otherwise allow for the dissemination to the public of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, such platforms shall not remove, disable access to, suspend or otherwise interfere with such content or the related service or suspend or terminate the related account on the basis of the alleged incompatibility of such content with its terms and conditions.
2021/07/19
Committee: JURI
Amendment 562 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;deleted
2021/07/19
Committee: JURI
Amendment 567 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, competition, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/08
Committee: IMCO
Amendment 567 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action;
2021/07/19
Committee: JURI
Amendment 571 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/19
Committee: JURI
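Points (b) and (d) above require reporting both the average and the median time needed for taking decisions. A purely illustrative sketch (all names are hypothetical, not part of the Regulation) of why reporting both is informative: a single slow case skews the average but barely moves the median.

```python
from datetime import timedelta
from statistics import median

def handling_time_stats(durations):
    """Average and median time needed for taking decisions."""
    average = sum(durations, timedelta()) / len(durations)
    return average, median(durations)

# One 100-hour outlier among otherwise fast decisions:
times = [timedelta(hours=h) for h in (2, 3, 4, 100)]
avg, med = handling_time_stats(times)
# avg == 27 h 15 min, med == 3 h 30 min
```

The median here stays close to the typical case while the average is dominated by the outlier, which is the point of the amendment's addition of "and median" to the reporting obligation.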
Amendment 576 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/07/19
Committee: JURI
Amendment 580 #
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific templates of reports referred to in paragraph 1.
2021/07/19
Committee: JURI
Amendment 591 #
2. Notices submitted under the mechanisms referred to in paragraph 1 shall be sufficiently precise and adequately substantiated, on the basis of which a diligent reviewer can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2021/07/19
Committee: JURI
Amendment 595 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary and applicable, additional information enabling the identification of the illegal content which shall be appropriate to the type of content and to the specific type of intermediary;
2021/07/19
Committee: JURI
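For illustration only, the notice elements visible in this extract (the exact URL or URLs of point (b), plus content-type-appropriate identifying information) could be represented by a provider as a simple structure; the class and field names below are hypothetical, not drawn from the Regulation.

```python
from dataclasses import dataclass, field

@dataclass
class Notice:
    """Hypothetical container for the notice elements in this extract."""
    electronic_location: list            # exact URL or URLs -- point (b)
    additional_identifiers: dict = field(default_factory=dict)  # where necessary and applicable
    content_type: str = "unspecified"    # appropriate to the type of content

def has_required_location(notice):
    # A notice without at least one exact URL cannot identify the content.
    return len(notice.electronic_location) > 0

n = Notice(electronic_location=["https://example.com/post/123"])
```

A triage pipeline could reject notices failing such minimal checks as insufficiently precise before any substantive review.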
Amendment 604 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned where there is no doubt as to the illegality of the specific item of content. In case of uncertainty, and after taking reasonable steps to assess the illegality of the specific item of content, withholding from removal of the content by the provider shall be perceived as acting in good faith and should not lead to waiving the liability exemption provided for in Article 5.
2021/07/19
Committee: JURI
Amendment 613 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. Where the provider has no technical, operational or contractual ability to act against specific items of illegal content, it may hand over a notice to the provider that has direct control of specific items of illegal content, while informing the notifying person or entity and the relevant Digital Services Coordinator.
2021/07/19
Committee: JURI
Amendment 627 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to, or otherwise limit the availability, visibility or accessibility of specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/19
Committee: JURI
Amendment 644 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional, individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises.
2021/07/19
Committee: JURI
Amendment 649 #
Proposal for a regulation
Article 2 – paragraph 1 – point b a (new)
(ba) 'active end user' means an individual successfully accessing an online interface and having significant interaction with it, its product or service;
2021/07/08
Committee: IMCO
Amendment 661 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) any other decisions that affect the availability, visibility or accessibility of that content or of the recipient's account, or the recipient's access to significant features of the platform's regular services.
2021/07/19
Committee: JURI
Amendment 672 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. The decision mentioned in this paragraph shall also include: – information on whether the decision referred to in paragraph 1 was taken as a result of human review or through automated means; – in case the decision referred to in paragraph 1 is upheld, a detailed explanation of how the information to which the complaint relates is in breach of the platform’s terms and conditions or why the online platform considers the information to be unlawful.
2021/07/19
Committee: JURI
Amendment 674 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service, unless this activity is an ancillary and additional feature of another service which is not an information society service and cannot, for objective or technical reasons, be provided independently of it;
2021/07/08
Committee: IMCO
Amendment 674 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. Complainants shall have the right to request human review and consultation with relevant online platforms’ staff with respect to the content to which the complaint relates.
2021/07/19
Committee: JURI
Amendment 676 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5a. Recipients of the service negatively affected by the decision of an online platform shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned. The procedure shall ensure that an independent judicial authority decides on the matter without undue delay, reaching a decision within 14 working days, while granting the negatively affected party the right to seek interim measures to be imposed within 48 hours from when their claim for redress is brought before this judicial authority. The rights to seek judicial redress and to obtain interim measures shall not be limited or subjected to the condition of exhausting the internal complaint-handling system.
2021/07/19
Committee: JURI
Amendment 689 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any specific information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/08
Committee: IMCO
Amendment 698 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
2021/07/08
Committee: IMCO
Amendment 702 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. Member States shall establish a mechanism enabling the recipients of the service to contest decisions of out-of-court dispute settlement bodies before a national judicial authority or an administrative authority relevant for resolving disputes related to particular content.
2021/07/19
Committee: JURI
Amendment 703 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘online social networking service’ means a platform that enables end users to connect, share, discover and communicate with each other across multiple devices and, in particular, via chats, posts, videos and recommendations;
2021/07/08
Committee: IMCO
Amendment 708 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by competent trusted flaggers, addressing allegedly illegal content that can seriously affect public security, policy or consumers' health or safety through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/07/19
Committee: JURI
Amendment 710 #
Proposal for a regulation
Article 2 – paragraph 1 – point i
(i) ‘dissemination to the public’ means taking an active role in making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
2021/07/08
Committee: IMCO
Amendment 713 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(ia) ‘online marketplace’ means an online platform which facilitates traders’ access to consumers, advertises their offers and redirects to their profiles or websites, regardless of whether the transaction is finalised on the platform or outside the platform;
2021/07/08
Committee: IMCO
Amendment 716 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence that could be exercised in one or more Member States for the purposes of detecting, identifying and notifying specific types of illegal content;
2021/07/19
Committee: JURI
Amendment 730 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. This communication shall include the geographical scope within which the trusted flagger competence was recognised based on the approval of a particular Digital Services Coordinator and information on expertise and competence declared by the trusted flagger.
2021/07/19
Committee: JURI
Amendment 732 #
Proposal for a regulation
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. Notices referred to in paragraph 1 of this Article shall be processed with priority with respect to the geographical scope of the trusted flagger, in accordance with the awarding of the status by Member States.
2021/07/19
Committee: JURI
Amendment 733 #
4a. Trusted flaggers shall provide the Digital Services Coordinator of establishment with clear and accessible reports on notices they sent during the relevant period, at least once every three years. Those reports shall include information on: (a) the number of notices submitted in accordance with Article 14, categorised by the type of presumed illegal content concerned; (b) the number and percentage of notices that led to the removal or suspension of the content concerned; and (c) the number of notices that were considered to be insufficiently precise or inadequately substantiated by the online platforms.
2021/07/19
Committee: JURI
Amendment 748 #
Proposal for a regulation
Article 3 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, or an improvement of the security of that transmission, the service provider shall not be liable for the information transmitted, on condition that the provider:
2021/07/08
Committee: IMCO
Amendment 750 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a court or functionally independent administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
2021/07/08
Committee: IMCO
Amendment 754 #
Proposal for a regulation
Article 20 – paragraph 2
2. Providers of hosting services may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/07/19
Committee: JURI
Amendment 805 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order via a secure communications channel to act against a specific item or multiple items of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
2021/07/08
Committee: IMCO
Amendment 818 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every six months, information on the average monthly active end users of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2021/07/19
Committee: JURI
Amendment 839 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(ca) the actor receiving the order has technical and operational ability to act against specific, notified illegal content and has direct control over it.
2021/07/08
Committee: IMCO
Amendment 845 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide for at least four consecutive months their services to a number of average monthly active end users of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/07/19
Committee: JURI
Amendment 847 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3a. The Digital Services Coordinator of each Member State, on its own initiative and within 96 hours of receiving a copy of the order to act through the system developed in accordance with paragraph 4a of this Article, shall have the right to scrutinise the order to determine whether it infringes the respective Member State's law and deem it invalid on its own territory by adopting a reasoned decision.
2021/07/08
Committee: IMCO
Amendment 847 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active end users of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active end users of the service in the Union, taking into account different accessibility features.
2021/07/19
Committee: JURI
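The interplay of the four-consecutive-month condition in Article 25(1) with the six-month averaging in Article 23(2) can be sketched as follows. This is illustrative only: the binding methodology is left to the delegated acts, and the function name and the exact rolling-window reading are assumptions.

```python
from statistics import mean

VLOP_THRESHOLD = 45_000_000  # Article 25(1): 45 million average monthly active end users

def meets_threshold(monthly_end_users):
    """One possible reading: the six-month rolling average of monthly
    active end users stays at or above the threshold for at least four
    consecutive months. The delegated acts may define this differently."""
    consecutive = 0
    for i in range(len(monthly_end_users)):
        window = monthly_end_users[max(0, i - 5): i + 1]  # past six months
        if mean(window) >= VLOP_THRESHOLD:
            consecutive += 1
            if consecutive >= 4:
                return True
        else:
            consecutive = 0
    return False
```

Under this sketch, three months above the threshold would not suffice, while four consecutive qualifying months would trigger the designation procedure of Article 25(4).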
Amendment 848 #
Proposal for a regulation
Article 8 – paragraph 3 b (new)
3b. Where the Digital Services Coordinator adopts a reasoned decision in accordance with paragraph 3a, (a) the Digital Services Coordinator shall communicate that decision to the authority that issued that order and the concerned provider of the service, and, (b) after receiving a decision finding that the content was not in fact illegal, the concerned provider shall immediately reinstate the content or access thereto in the territory of the Member State of the Digital Services Coordinator who issued the decision.
2021/07/08
Committee: IMCO
Amendment 849 #
Proposal for a regulation
Article 25 – paragraph 4 – introductory part
4. The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active end users of the service in the Union of online platforms under its jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission.
2021/07/19
Committee: JURI
Amendment 850 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
4a. After receiving the decision about the designation as a very large online platform, the online platform may appeal this decision before the Digital Services Coordinator issuing the designation within 60 days. The Digital Services Coordinator may consult the Board. The Digital Services Coordinator shall especially consider the following information while assessing the appeal: (a) the type of content usually shared and the type of the active end user on a given online platform; (b) the exposure to the illegal content as reported under Article 23 and measures taken to mitigate the risks by the online platform; and (c) the exposure to the systemic risks as referred to in Article 26. The Digital Services Coordinator shall decide on the appeal within 60 days. The Digital Services Coordinator may repeatedly initiate this procedure when deemed necessary, after accepting the appeal.
2021/07/19
Committee: JURI
Amendment 851 #
Proposal for a regulation
Article 25 – paragraph 4 b (new)
4b. The Digital Services Coordinator of establishment may request any online platform to submit a report assessing the dissemination of illegal content through their services, when justified by the information provided in the report submitted in accordance with Article 23. If, after thorough assessment, the Digital Services Coordinator has identified the platform in question as posing significant systemic risks stemming from dissemination of illegal content through their services in the Union, the Digital Services Coordinator may then require proportionate compliance with some or all obligations of Articles 26 to 37.
2021/07/19
Committee: JURI
Amendment 852 #
Proposal for a regulation
Article 25 – paragraph 4 c (new)
4c. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific methodology for the purpose of paragraph 4a and 4b.
2021/07/19
Committee: JURI
Amendment 854 #
Proposal for a regulation
Article 8 – paragraph 4 a (new)
4a. The Commission shall adopt implementing acts, organising a European information exchange system, allowing for secure communication and authentication of authorised orders between relevant authorities, Digital Services Coordinators and providers, as referred to in Articles 8(1), 8a(1) and 9(1). Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.
2021/07/08
Committee: IMCO
Amendment 856 #
Proposal for a regulation
Article 8 a (new)
Article 8a Orders to restore lawful content 1. Providers of intermediary services shall, upon the receipt of an order via a secure communications channel to restore a specific item or multiple items of removed content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders without undue delay, specifying the action taken and the moment when the action was taken. 2. Member States shall ensure that the orders referred to in paragraph 1 meet the following conditions: (a) the orders contain the following elements: (i) a statement of reasons explaining why the content in question is legal, by reference to the specific provision of Union or national law or court ruling; (ii) one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the legal content concerned; (iii) information about redress available to the provider of the service who removed the content and to the recipient of the service who notified the content; (b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; and (c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10.
2021/07/08
Committee: IMCO
Amendment 861 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order via a secure communications channel to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/07/08
Committee: IMCO
Amendment 867 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom and pluralism of the media, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 901 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact and ensure that information is up to date. Providers of intermediary services shall notify that information, including the name, postal address, the electronic mail address and telephone number of their single point of contact, to the Digital Services Coordinator in the Member State where they are established.
2021/07/08
Committee: IMCO
Amendment 910 #
Proposal for a regulation
Article 27 a (new)
Article 27a Mitigation of risks for the freedom of expression and freedom and pluralism of the media 1. Very large online platforms shall ensure that the exercise of the fundamental rights of freedom of expression and freedom and pluralism of the media is always adequately and effectively protected. 2. Where very large online platforms allow for the dissemination of press publications within the meaning of Art. 2(4) of Directive (EU) 2019/790, of audiovisual media services within the meaning of Article 1(1)(a) of Directive 2010/13/EU (AVMS) or of other editorial media, which are published in compliance with applicable Union and national law under the editorial responsibility and control of a press publisher, audiovisual or other media service provider, who can be held liable under the laws of a Member State, the platforms shall be prohibited from removing, disabling access to, suspending or otherwise interfering with such content or services or suspending or terminating the service providers’ accounts on the basis of the alleged incompatibility of such content with their terms and conditions[, as well as on the basis of any self-regulatory or co- regulatory standard or measure, including Codes of Conduct pursuant to Article 35 of this Regulation]. [The same shall apply to books and films or other expressions of opinion or statements of fact for the purpose of exercising the right to freedom of expression as enshrined in Article 11 of the Charter.] 3. Very large online platforms shall ensure that their content moderation, their decision-making processes, the features or functioning of their services, their terms and conditions and recommender systems are objective, fair and non-discriminatory.
2021/07/19
Committee: JURI
Amendment 916 #
Proposal for a regulation
Article 11 – paragraph 4
4. Providers of intermediary services shall notify identification data, including the name, postal address, the electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. The Digital Service Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity.
2021/07/08
Committee: IMCO
Amendment 920 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a. Providers of online social networking services designated as a very large online platform according to Article 25 shall designate a legal representative to be bound by the obligations laid down in this Article at the request of the Digital Services Coordinator of the Member States where this provider offers its services.
2021/07/08
Committee: IMCO
Amendment 932 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, unambiguous, straightforward and understandable language and shall be publicly available in an easily accessible format.
2021/07/08
Committee: IMCO
Amendment 932 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems or any other systems used to determine the order of presentation of content, including those which decrease the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in these systems.
2021/07/19
Committee: JURI
Amendment 933 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The main parameters referred to in paragraph 1 of this Article shall include at least the following elements: (a) the main criteria used by the relevant recommender system; (b) how these criteria are prioritised; (c) the optimisation goal of the relevant recommender system; and (d) an explanation of the role that the behaviour of the recipients of the service plays in how the relevant recommender system functions.
2021/07/19
Committee: JURI
Amendment 934 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/08
Committee: IMCO
Amendment 936 #
Proposal for a regulation
Article 29 – paragraph 1 b (new)
1b. Very large online platforms shall provide options for the recipients of the service to modify or influence parameters referred to in paragraph 2, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation(EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 937 #
Proposal for a regulation
Article 29 – paragraph 1 c (new)
1c. The parameters used in recommender systems shall always be fair and non-discriminatory.
2021/07/19
Committee: JURI
Amendment 939 #
Proposal for a regulation
Article 29 – paragraph 2
2. Very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to: (a) select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them; (b) select third party recommender systems.
2021/07/19
Committee: JURI
Amendment 954 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers designated as very large online platforms as referred to in Article 25, shall publish their terms and conditions in all official languages of the Union.
2021/07/08
Committee: IMCO
Amendment 958 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. The Digital Services Coordinator of each Member State has the right to request very large online platforms to apply measures and tools of content moderation, including algorithmic decision-making and human review reflecting Member State’s socio-cultural context. The framework for this cooperation as well as specific measures related thereto may be laid down in national legislation and shall be notified to the Commission.
2021/07/08
Committee: IMCO
Amendment 961 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. The Digital Services Coordinator of each Member State, by means of national legislation, may request a very large online platform to cooperate with the Digital Services Coordinator of the Member State in question in handling cases involving the removal of lawful content online that is taken down erroneously if there is reason to believe that the Member State’s socio-cultural context may have played a vital role.
2021/07/08
Committee: IMCO
Amendment 980 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;deleted
2021/07/08
Committee: IMCO
Amendment 986 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action;
2021/07/08
Committee: IMCO
Amendment 993 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/08
Committee: IMCO
Amendment 1003 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/07/08
Committee: IMCO
Amendment 1011 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific templates of reports referred to in paragraph 1.
2021/07/08
Committee: IMCO
Amendment 1014 #
Proposal for a regulation
Article 35 – paragraph 1
The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2021/07/19
Committee: JURI
Amendment 1018 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/19
Committee: JURI
Amendment 1036 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. Notices submitted under the mechanisms referred to in paragraph 1 shall be sufficiently precise and adequately substantiated, on the basis of which a diligent reviewer can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2021/07/08
Committee: IMCO
Amendment 1044 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information and, where necessary and applicable, additional information enabling the identification of the illegal content, which shall be appropriate to the type of content and to the specific type of intermediary;
2021/07/08
Committee: IMCO
Amendment 1055 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
3a. Member States shall have jurisdiction for the purposes of Chapters III and IV of this Regulation where providers of online social networking services designated as very large online platforms are concerned, as defined in Article 25 and which offer services to a significant number of active end users of the service in a given Member State, which can be calculated on the basis of Article 23(2).
2021/07/19
Committee: JURI
Amendment 1056 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned where there is no doubt as to the illegality of the specific item of content. In case of uncertainty and after taking reasonable steps to assess the illegality of the specific item of content, withholding from removal of the content by the provider shall be perceived as acting in good faith and shall not lead to waiving the liability exemption provided for in Article 5.
2021/07/08
Committee: IMCO
Amendment 1066 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. The assessment of the complaint can be supplemented by the opinion of the Digital Services Coordinator of the Member State where the recipient resides or is established on how the matter should be resolved, taking into account the national law and socio-cultural context of the Member State in question. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority.
2021/07/19
Committee: JURI
Amendment 1070 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. Where the provider has no technical, operational or contractual ability to act against specific items of illegal content, it may hand over a notice to the provider that has direct control of specific items of illegal content, while informing the notifying person or entity and the relevant Digital Services Coordinator.
2021/07/08
Committee: IMCO
Amendment 1070 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Pursuant to paragraph 1 of this Article, the Digital Services Coordinator of establishment, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established, on how the complaint has been handled.
2021/07/19
Committee: JURI
Amendment 1076 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information, including at least information on the name of the issuing authority, the name of the provider and the type of action specified in the order, issued in accordance with Articles 8, 8a and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
2021/07/19
Committee: JURI
Amendment 1078 #
Proposal for a regulation
Article 44 – paragraph 2 a (new)
2a. Based on the information published by Digital Services Coordinators, the Commission shall submit to the European Parliament and to the Council a dedicated biennial report analysing the aggregated data on orders referred to in Articles 8, 8a and 9 and issued by the Digital Services Coordinators, with special attention paid to the potential abusive use of those Articles. The report shall provide a comprehensive overview of the orders to act against illegal content and shall make it possible, for a specific period of time, to assess the activities of Digital Services Coordinators.
2021/07/19
Committee: JURI
Amendment 1079 #
Proposal for a regulation
Article 44 – paragraph 3 a (new)
3a. The Commission shall adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 4 of Regulation (EU) No 182/2011.
2021/07/19
Committee: JURI
Amendment 1082 #
Proposal for a regulation
Article 45 – paragraph 1 a (new)
1a. A request or recommendation pursuant to paragraph 1 of this Article shall not preclude the Digital Services Coordinator of the Member State where the recipient of the service resides or is established from carrying out its own investigation concerning a suspected infringement of this Regulation by a provider of an intermediary service.
2021/07/19
Committee: JURI
Amendment 1085 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2a. A recommendation pursuant to paragraphs 1 and 2 of this Article may additionally indicate: (a) an opinion on matters that involve taking into account national law and socio-cultural context; and (b) a draft decision based on investigation pursuant to paragraph 1a of this Article.
2021/07/19
Committee: JURI
Amendment 1096 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information should also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
2021/07/19
Committee: JURI
Amendment 1098 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access, or otherwise limit the availability, visibility or accessibility to specific items of information, provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/08
Committee: IMCO
Amendment 1104 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure and inform the Commission thereof.
2021/07/19
Committee: JURI
Amendment 1106 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) adopt opinions on issues concerning very large online platforms in accordance with this Regulation;
2021/07/19
Committee: JURI
Amendment 1108 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
(ea) issue opinions, recommendations or advice on matters related to Article 34.
2021/07/19
Committee: JURI
Amendment 1130 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
2021/07/19
Committee: JURI
Amendment 1138 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. Following an additional individual risk assessment, the Digital Services Coordinator of establishment may extend the exemption to selected medium-sized enterprises.
2021/07/08
Committee: IMCO
Amendment 1145 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 13, 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
2021/07/19
Committee: JURI
Amendment 1146 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 13, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
2021/07/19
Committee: JURI
Amendment 1147 #
Proposal for a regulation
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 13, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
2021/07/19
Committee: JURI
Amendment 1148 #
Proposal for a regulation
Article 73 – paragraph 1
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation.
2021/07/19
Committee: JURI
Amendment 1149 #
Proposal for a regulation
Article 73 – paragraph 4
4. By three years from the date of application of this Regulation at the latest, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board.deleted
2021/07/19
Committee: JURI
Amendment 1150 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - sixteen months after its entry into force].
2021/07/19
Committee: JURI
Amendment 1164 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) any other decisions that affect the availability, visibility or accessibility of that content or the account of the recipient, or the recipient's access to significant features of the platform's regular services.
2021/07/08
Committee: IMCO
Amendment 1186 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. The decision mentioned in this paragraph shall also include: - information on whether the decision referred to in paragraph 1 was taken as a result of human review or through automated means; - in case the decision referred to in paragraph 1 is upheld, a detailed explanation on how the information to which the complaint relates to is in breach of the platform’s terms and conditions or why the online platform considers the information to be unlawful.
2021/07/08
Committee: IMCO
Amendment 1192 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. Complainants shall have the right to request human review and consultation with relevant online platforms’ staff with respect to content to which the complaint relates to.
2021/07/08
Committee: IMCO
Amendment 1197 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5a. Recipients of the service negatively affected by the decision of an online platform shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned. The procedure shall ensure that an independent judicial authority decides on the matter without undue delay, reaching a decision within 14 working days while granting the negatively affected party the right to seek interim measures to be imposed within 48 hours from when their redress is brought before this judicial authority. The rights to seek judicial redress and to obtain interim measures shall not be limited or subjected to the condition of exhausting the internal complaint-handling system.
2021/07/08
Committee: IMCO
Amendment 1254 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. Member States shall establish a mechanism enabling the recipients of the service to contest decisions of out-of-court dispute settlement bodies before a national judicial authority or an administrative authority relevant for resolving disputes related to particular content.
2021/07/08
Committee: IMCO
Amendment 1261 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by competent trusted flaggers, addressing allegedly illegal content that can seriously affect public security, policy or consumers' health or safety through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/07/08
Committee: IMCO
Amendment 1270 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence that could be exercised in one or more Member States for the purposes of detecting, identifying and notifying specific types of illegal content;
2021/07/08
Committee: IMCO
Amendment 1294 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. This communication shall include the geographical scope within which the trusted flagger competence was recognised based on the approval of a particular Digital Services Coordinator and information on expertise and competence declared by the trusted flagger.
2021/07/08
Committee: IMCO
Amendment 1298 #
Proposal for a regulation
Article 19 – paragraph 4
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. Notices referred to in paragraph 1 of this Article shall be processed with priority with respect to the geographical scope of the trusted flagger, in accordance with the awarding of that status by Member States.
2021/07/08
Committee: IMCO
Amendment 1300 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4a. Trusted flaggers shall provide the Digital Services Coordinator of establishment with clear and accessible reports on notices they sent during the relevant period, at least once every three years. Those reports shall include information on: (a) the number of notices submitted in accordance with Article 14, categorised by the type of presumed illegal content concerned; (b) the number and percentage of notices that led to the removal or suspension of the content concerned; and (c) the number of notices that were considered to be insufficiently precise or inadequately substantiated by the online platforms.
2021/07/08
Committee: IMCO
Amendment 1330 #
Proposal for a regulation
Article 20 – paragraph 2
2. Providers of hosting services may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/07/08
Committee: IMCO
Amendment 1380 #
Proposal for a regulation
Article 22 – paragraph 1 – point b
(b) a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50 ; __________________ 50Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/ECdeleted
2021/07/08
Committee: IMCO
Amendment 1382 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) as defined by Council Directive 2011/16/EU of 15 February 2011 on administrative cooperation in the field of taxation, the Financial Account Identifier to which the Consideration is paid or credited, insofar as it is available to the online platform;
2021/07/08
Committee: IMCO
Amendment 1390 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law.deleted
2021/07/08
Committee: IMCO
Amendment 1408 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the identification of the trader as referred to in points (a), (b), (c), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible reliable and independent source or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources.
2021/07/08
Committee: IMCO
Amendment 1435 #
Proposal for a regulation
Article 22 – paragraph 4
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information in accordance with applicable laws.
2021/07/08
Committee: IMCO
Amendment 1441 #
Proposal for a regulation
Article 22 – paragraph 6
6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. Where certain information may not be disclosed for privacy reasons, the online platform shall disclose the information in a way that is not detrimental to the trader’s business operations. The online platform shall also provide effective means for the recipients of the service to enter in direct contact with the trader, whether through the information referred to in paragraph 1(b) or (c) or through any other electronic means made available by the online platform.
2021/07/08
Committee: IMCO
Amendment 1475 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every six months, information on the average monthly active end users of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2021/07/08
Committee: IMCO
Amendment 1533 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide for at least four consecutive months their services to a number of average monthly active end users of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/07/08
Committee: IMCO
Amendment 1535 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active end users of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active end users of the service in the Union, taking into account different accessibility features.
2021/07/08
Committee: IMCO
Amendment 1538 #
Proposal for a regulation
Article 25 – paragraph 4 – subparagraph 1
The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active end users of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission.
2021/07/08
Committee: IMCO
Amendment 1539 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
4a. After receiving the decision about the designation as a very large online platform, the online platform may appeal this decision before the Digital Services Coordinator issuing the designation within 60 days. The Digital Services Coordinator may consult the Board. The Digital Services Coordinator shall especially consider the following information while assessing the appeal: (a) the type of content usually shared and the type of the active end user on a given online platform; (b) the exposure to the illegal content as reported under Article 23 and measures taken to mitigate the risks by the online platform; and (c) the exposure to the systemic risks as referred to in Article 26. The Digital Services Coordinator shall decide on the appeal within 60 days. The Digital Services Coordinator may repeatedly initiate this procedure when deemed necessary, after accepting the appeal.
2021/07/08
Committee: IMCO
Amendment 1541 #
Proposal for a regulation
Article 25 – paragraph 4 b (new)
4b. The Digital Services Coordinator of establishment may request any online platform to submit a report assessing the dissemination of illegal content through their services, when justified by the information provided in the report submitted in accordance with Article 23. If, after thorough assessment, the Digital Services Coordinator has identified the platform in question as posing significant systemic risks stemming from dissemination of illegal content through their services in the Union, the Digital Services Coordinator may then require proportionate compliance with some or all obligations of Articles 26 to 37.
2021/07/08
Committee: IMCO
Amendment 1542 #
Proposal for a regulation
Article 25 – paragraph 4 c (new)
4c. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for the purposes of paragraphs 4a and 4b.
2021/07/08
Committee: IMCO
Amendment 1693 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems or any other systems used to determine the order of presentation of content, including those which decrease the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in these systems.
2021/07/08
Committee: IMCO
Amendment 1696 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The main parameters referred to in paragraph 1 of this Article shall include at least the following elements:
(a) the main criteria used by the relevant recommender system;
(b) how these criteria are prioritised;
(c) the optimisation goal of the relevant recommender system; and
(d) an explanation of the role that the behaviour of the recipients of the service plays in how the relevant recommender system functions.
2021/07/08
Committee: IMCO
Amendment 1699 #
Proposal for a regulation
Article 29 – paragraph 1 b (new)
1b. Very large online platforms shall provide options for the recipients of the service to modify or influence parameters referred to in paragraph 2, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1700 #
Proposal for a regulation
Article 29 – paragraph 2
2. Very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to:
(a) select and modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them;
(b) select third-party recommender systems.
2021/07/08
Committee: IMCO
Amendment 1849 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2021/07/08
Committee: IMCO
Amendment 1855 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/08
Committee: IMCO
Amendment 1936 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
3a. A Member State shall have jurisdiction for the purposes of Chapters III and IV of this Regulation where providers of online social networking services designated as very large online platforms, as defined in Article 25, are concerned and offer services to a significant number of active end users of the service in that Member State, which can be calculated on the basis of Article 23(2).
2021/07/08
Committee: IMCO
Amendment 1967 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Assessment of the complaint may be supplemented by the opinion of the Digital Services Coordinator of the Member State where the recipient resides or is established on how the matter should be resolved, taking into account the national law and socio-cultural context of the Member State concerned. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
2021/07/08
Committee: IMCO
Amendment 1970 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Pursuant to paragraph 1 of this Article, the Digital Services Coordinator of establishment, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established, on how the complaint has been handled.
2021/07/08
Committee: IMCO
Amendment 1976 #
Proposal for a regulation
Article 44 – paragraph 2 – point a
(a) the number and subject matter of orders to act against illegal content and orders to provide information, including at least information on the name of the issuing authority, the name of the provider and the type of action specified in the order, issued in accordance with Articles 8, 8a and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;
2021/07/08
Committee: IMCO
Amendment 1981 #
Proposal for a regulation
Article 44 – paragraph 2 a (new)
2a. Based on the information published by Digital Services Coordinators, the Commission shall submit to the European Parliament and to the Council a dedicated biennial report analysing the aggregated data on orders referred to in Articles 8, 8a and 9 and issued by the Digital Services Coordinators, paying special attention to potential abusive use of those Articles. The report shall provide a comprehensive overview of the orders to act against illegal content and shall make it possible to assess, for a specific period of time, the activities of Digital Services Coordinators.
2021/07/08
Committee: IMCO
Amendment 1982 #
Proposal for a regulation
Article 44 – paragraph 3 a (new)
3a. The Commission shall adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 4 of Regulation (EU) No 182/2011.
2021/07/08
Committee: IMCO
Amendment 1988 #
Proposal for a regulation
Article 45 – paragraph 1 a (new)
1a. A request or recommendation pursuant to paragraph 1 of this Article shall not preclude the Digital Services Coordinator of the Member State where the recipient of the service resides or is established from carrying out its own investigation concerning a suspected infringement of this Regulation by a provider of an intermediary service.
2021/07/08
Committee: IMCO
Amendment 1994 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2a. A recommendation pursuant to paragraphs 1 and 2 of this Article may additionally indicate:
(a) an opinion on matters that involve taking into account national law and socio-cultural context; and
(b) a draft decision based on an investigation pursuant to paragraph 1a of this Article.
2021/07/08
Committee: IMCO
Amendment 2011 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
2021/07/08
Committee: IMCO
Amendment 2072 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure and inform the Commission thereof.
2021/07/08
Committee: IMCO
Amendment 2086 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) adopt opinions on issues concerning very large online platforms in accordance with this Regulation;
2021/07/08
Committee: IMCO
Amendment 2090 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
(ea) issue opinions, recommendations or advice on matters related to Article 34.
2021/07/08
Committee: IMCO
Amendment 2141 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
2021/07/08
Committee: IMCO
Amendment 2201 #
Proposal for a regulation
Article 58 – paragraph 5
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision and may order the removal of content, the restriction of access to an online interface, or the explicit display of a warning to recipients when they access an online interface.
2021/07/08
Committee: IMCO
Amendment 2206 #
Proposal for a regulation
Article 58 – paragraph 5 a (new)
5a. A decision ordered pursuant to paragraph 5 shall be executable with immediate effect.
2021/07/08
Committee: IMCO
Amendment 2281 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 13, 23, 25 and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
2021/07/08
Committee: IMCO
Amendment 2283 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 13, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
2021/07/08
Committee: IMCO
Amendment 2286 #
Proposal for a regulation
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 13, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
2021/07/08
Committee: IMCO
Amendment 2290 #
Proposal for a regulation
Article 73 – paragraph 1
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation.
2021/07/08
Committee: IMCO
Amendment 2291 #
Proposal for a regulation
Article 73 – paragraph 4
4. By three years from the date of application of this Regulation at the latest, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board.deleted
2021/07/08
Committee: IMCO
Amendment 2295 #
Proposal for a regulation
Article 74 – paragraph 2
2. It shall apply from [date - sixteen months after its entry into force].
2021/07/08
Committee: IMCO