
Activities of Axel VOSS related to 2020/0361(COD)

Legal basis opinions (0)

Amendments (182)

Amendment 104 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2021/07/20
Committee: JURI
Amendment 115 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected and are lex specialis, prevailing over this Regulation.
2021/07/20
Committee: JURI
Amendment 123 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant, dangerous or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/20
Committee: JURI
Amendment 131 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of the principal service and that feature cannot, for objective technical reasons, be used without that principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/20
Committee: JURI
Amendment 141 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. A provider of intermediary services plays an active role when it gives assistance to the recipient of the service, notably for optimising and promoting the content offered. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
2021/07/20
Committee: JURI
Amendment 144 #
Proposal for a regulation
Recital 20
(20) The provider should not be able to benefit from the exemptions from liability provided for in this Regulation where the main purpose is to engage in or facilitate illegal activities or where a provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities and thus does not provide its service neutrally.
2021/07/20
Committee: JURI
Amendment 153 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders on the platforms, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/07/20
Committee: JURI
Amendment 181 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information, which should include the relevant email addresses, telephone numbers, IP addresses and other contact details necessary to ensure such compliance, should be available in respect of all types of orders. Orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
2021/07/20
Committee: JURI
Amendment 186 #
Proposal for a regulation
Recital 2 a (new)
(2a) Moreover, complex national regulatory requirements, fragmented implementation and insufficient enforcement of legislation such as Directive 2000/31/EC have contributed to high administrative costs and legal uncertainty for intermediary services operating on the internal market, especially for micro, small and medium-sized companies.
2021/07/08
Committee: IMCO
Amendment 196 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
deleted
2021/07/20
Committee: JURI
Amendment 204 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
deleted
2021/07/20
Committee: JURI
Amendment 213 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/20
Committee: JURI
Amendment 218 #
Proposal for a regulation
Recital 43 a (new)
(43 a) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide, based on its own assessment, whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/20
Committee: JURI
Amendment 219 #
Proposal for a regulation
Recital 43 b (new)
(43 b) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/20
Committee: JURI
Amendment 245 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. Online advertising is a significant source of financing for many digital business models and an effective tool to reach new customers, not least for small and medium-sized companies. However, there are some instances when online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. Based on the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/20
Committee: JURI
Amendment 249 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Services, such as internet infrastructure services or cloud service providers, which are provided at the request of parties other than the content providers and only indirectly benefitting the latter, should not be covered by the definition of online platforms. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/08
Committee: IMCO
Amendment 257 #
Proposal for a regulation
Recital 17
(17) The relevant rules of Chapter II should only establish when the provider of intermediary services concerned cannot be held liable in relation to illegal content provided by the recipients of the service. Those rules should by no means be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine. Furthermore, the exemptions from liability established in this Regulation should apply in respect of any type of liability as regards any type of illegal content, irrespective of the precise subject matter or nature of those laws.
2021/07/08
Committee: IMCO
Amendment 265 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary and proportionate means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. Mitigating measures may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 277 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have an impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
2021/07/19
Committee: JURI
Amendment 278 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act without undue delay to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/07/08
Committee: IMCO
Amendment 279 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/07/19
Committee: JURI
Amendment 281 #
Proposal for a regulation
Recital 22 a (new)
(22a) The exemption of liability should not apply where the recipient of the service is acting under the authority or the control of the provider of a hosting service. In particular, where the provider of the online platform that allows consumers to conclude distance contracts with traders does not allow traders to determine the basic elements of the trader-consumer contract, such as the terms and conditions governing such relationship or the price, it should be considered that the trader acts under the authority or control of that platform.
2021/07/08
Committee: IMCO
Amendment 282 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders as a functionality of their service, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. This is the case where the online platform operator fails to clearly display the identity of the trader in accordance with this Regulation. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. In particular, it is relevant whether the online platform operator withholds such identity or contract details until after the conclusion of the trader-consumer contract, or is marketing the product or service in its own name rather than using the name of the trader who will supply it.
2021/07/08
Committee: IMCO
Amendment 282 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for providing information or compelling access to data from very large online platforms to vetted researchers where relevant to a research project. All requests for providing information or access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/19
Committee: JURI
Amendment 291 #
Proposal for a regulation
Recital 23 a (new)
(23a) Consumers should be able to safely purchase products and services online, irrespective of whether a product or service has been produced in the Union. For that reason, traders from third countries should establish a legal representative in the Union to whom claims regarding product safety could be addressed. Providers of intermediary services from inside the Union as well as from third countries should ensure compliance with product requirements set out in Union law.
2021/07/08
Committee: IMCO
Amendment 301 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
2021/07/19
Committee: JURI
Amendment 305 #
Proposal for a regulation
Recital 70
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.
2021/07/19
Committee: JURI
Amendment 310 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, cloud infrastructure services or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
2021/07/08
Committee: IMCO
Amendment 331 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. Since intermediaries should not be required to remove information which is legal in their country of establishment, national and Union authorities should be able to order the blocking of content legally published outside the Union only for the territory of the Union where Union law is infringed and for the territory of the issuing Member State where national law is infringed.
2021/07/08
Committee: IMCO
Amendment 346 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should target illegal content and aim in particular to guarantee different public policy objectives such as consumer protection and the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 353 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type, nature and size of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
2021/07/08
Committee: IMCO
Amendment 358 #
Proposal for a regulation
Recital 36 a (new)
(36a) Providers of intermediary services should also establish a single point of contact for recipients of services, allowing rapid, direct and efficient communication.
2021/07/08
Committee: IMCO
Amendment 362 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. Obligations related to terms and conditions should not oblige a provider of an intermediary service to disclose information that will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets or intellectual property rights.
2021/07/08
Committee: IMCO
Amendment 372 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro, small or medium-sized enterprises as defined in Commission Recommendation 2003/361/EC.40 __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/08
Committee: IMCO
Amendment 373 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in relation to their population in one or more Member States; or
2021/07/19
Committee: JURI
Amendment 376 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content following the applicable law ('action'). Such mechanisms should be clearly visible on the interface of the hosting service and easy to use. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Providers of hosting services could, as a voluntary measure, conduct own-investigation measures to prevent content which has previously been identified as illegal from being disseminated again once removed. The obligations related to notice and action should by no means impose general monitoring obligations.
2021/07/08
Committee: IMCO
Amendment 385 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. Providers of hosting services should act upon notices without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action. The provider of hosting services should inform the individual or entity notifying the specific content of its decision without undue delay after taking a decision whether to act upon the notice or not.
2021/07/08
Committee: IMCO
Amendment 388 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of the principal service and, for objective and technical reasons, cannot be used without that principal service, and the integration of the feature into that principal service is not a means to circumvent the applicability of this Regulation.
2021/07/19
Committee: JURI
Amendment 389 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘editorial platform’ means an intermediary service which is provided in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790 or another editorial media service and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content, and which is under the supervision of the editorial team of the publication or other editorial media.
2021/07/19
Committee: JURI
Amendment 396 #
Proposal for a regulation
Recital 42 a (new)
(42a) A hosting service provider may in some instances become aware, for instance through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the hosting service provider is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the hosting service provider should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by hosting service providers. Hosting service providers should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
2021/07/08
Committee: IMCO
Amendment 397 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(i a) 'live streaming platform services' means an information society service the main purpose, or one of the main purposes, of which is to give the public access to live broadcast audio or video material which it organises and promotes for profit-making purposes;
2021/07/19
Committee: JURI
Amendment 400 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system, used by a very large online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
2021/07/19
Committee: JURI
Amendment 401 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro, small or medium-sized enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. __________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/08
Committee: IMCO
Amendment 418 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. Providers of hosting services could, as a voluntary measure, introduce own-investigation measures to prevent accounts which have previously been identified as illegal from reappearing once removed. The obligations related to notice and action should by no means impose general monitoring obligations. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/08
Committee: IMCO
Amendment 419 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts within the deadlines set out in Article 5(1a) (new) to remove or disable access to the illegal content.
2021/07/19
Committee: JURI
Amendment 421 #
Proposal for a regulation
Article 5 – paragraph 1 a (new)
1 a. Without prejudice to specific deadlines set out in Union law or within administrative or legal orders, providers of hosting services shall, upon obtaining actual knowledge or awareness, remove or disable access to illegal content as soon as possible and in any event: (a) within 30 minutes where the illegal content pertains to the broadcast of a live sports or entertainment event; (b) within 24 hours where the illegal content can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety; (c) within seven days in all other cases where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety. Where the provider of hosting services cannot comply with the obligation in paragraph 1a on grounds of force majeure or for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the competent authority.
2021/07/19
Committee: JURI
Amendment 423 #
Proposal for a regulation
Article 5 – paragraph 2 a (new)
2 a. Paragraph 1 shall not apply when the main purpose of the information society service is to engage in or facilitate illegal activities or when the provider of the information society service deliberately collaborates with a recipient of the services in order to undertake illegal activities.
2021/07/19
Committee: JURI
Amendment 425 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders on the platform, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
2021/07/19
Committee: JURI
Amendment 430 #
Proposal for a regulation
Article 5 a (new)
Article 5 a The exemptions from liability established in Articles 3, 4 and 5 shall not apply where the information society service plays an active role of such a kind as to give it knowledge of, or control over, the information provided by the recipient of the service.
2021/07/19
Committee: JURI
Amendment 445 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers, the VAT Information Exchange System45 and the Union Rapid Alert System for dangerous non-food products (Rapex), or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. __________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/07/08
Committee: IMCO
Amendment 455 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. Online advertising is a significant source of financing for many digital business models and an effective tool to reach new customers, not least for small and medium-sized companies. However, there are some instances when online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. To ensure consumer protection, online advertisement should be subject to proportionate and meaningful transparency obligations. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/08
Committee: IMCO
Amendment 469 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses could have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. Accordingly, the number of average monthly recipients of the service should reflect the recipients actually reached by the service, either by being exposed to content or by providing content disseminated on the platforms’ interface in that period of time. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. The threshold should be designed to target the largest platforms with a reach in the Union that could lead to a systemic impact. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means; placing such due diligence obligations on smaller companies, especially micro, small and medium-sized companies, would be disproportionate.
2021/07/08
Committee: IMCO
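Read operationally, the designation in this recital is a single comparison against the 45 million figure. A toy check under that assumption; the helper name and population constant are illustrative only:

```python
EU_POPULATION = 450_000_000            # rough figure implied by the recital (45 million = 10%)
VLOP_THRESHOLD = EU_POPULATION // 10   # operational threshold of 45 million recipients

def is_very_large_online_platform(avg_monthly_recipients: int) -> bool:
    """Recital 54: significant reach exists where the number of recipients exceeds the threshold."""
    return avg_monthly_recipients > VLOP_THRESHOLD

print(is_very_large_online_platform(46_000_000))  # True
print(is_very_large_online_platform(44_000_000))  # False
```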
Amendment 474 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can sometimes amplify the dissemination of illegal content. Effective regulation and enforcement is needed to effectively identify and mitigate the risks that may arise. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/07/08
Committee: IMCO
Amendment 490 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with its obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board within 30 days following its adoption, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
2021/07/08
Committee: IMCO
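The recital effectively defines a three-valued audit outcome. A compact sketch of that decision rule, with invented names for the auditor's findings:

```python
from enum import Enum

class AuditOpinion(Enum):
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"

def audit_opinion(all_evidence_shows_compliance: bool, has_minor_remarks: bool) -> AuditOpinion:
    """Decision rule paraphrased from recital 61; the two inputs are the auditor's findings."""
    if not all_evidence_shows_compliance:
        return AuditOpinion.NEGATIVE
    return AuditOpinion.POSITIVE_WITH_COMMENTS if has_minor_remarks else AuditOpinion.POSITIVE

print(audit_opinion(True, False))  # AuditOpinion.POSITIVE
```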
Amendment 496 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. Often, they facilitate the search for relevant content for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them through making active choices. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them and why. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
2021/07/08
Committee: IMCO
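Recital 62 asks that recipients see the main parameters of a recommender system and can switch to at least one option not based on profiling. A sketch of what such a user-facing choice could look like; the ranking functions and item structure are invented for illustration:

```python
from datetime import datetime

def rank_by_profile(items, user_profile):
    # profiling-based default: order by the user's predicted interest per topic
    return sorted(items, key=lambda i: user_profile.get(i["topic"], 0.0), reverse=True)

def rank_chronologically(items, user_profile=None):
    # non-profiling alternative contemplated by the recital: newest first
    return sorted(items, key=lambda i: i["published"], reverse=True)

RECOMMENDER_OPTIONS = {
    "personalised (profiling)": rank_by_profile,
    "most recent (no profiling)": rank_chronologically,
}

items = [
    {"topic": "sport", "published": datetime(2021, 7, 1)},
    {"topic": "news", "published": datetime(2021, 7, 8)},
]
profile = {"sport": 0.9, "news": 0.2}
print(RECOMMENDER_OPTIONS["most recent (no profiling)"](items))
```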
Amendment 500 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/07/08
Committee: IMCO
Amendment 503 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control, including email addresses, telephone numbers, IP addresses and other contact details necessary to determine the compliance referred to in (a);
2021/07/19
Committee: JURI
Amendment 506 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, where relevant to a research project. All requests for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 525 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. The Member States may require very large online platforms to designate a legal representative in their Member State.
2021/07/19
Committee: JURI
Amendment 528 #
Proposal for a regulation
Article 11 – paragraph 2
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources to guarantee their proper and timely cooperation with the Member States’ authorities, the Commission and the Board and to comply with those decisions.
2021/07/19
Committee: JURI
Amendment 531 #
Proposal for a regulation
Article 11 a (new)
Article 11a Exclusions: Articles 12 and 13 of Section 1, the provisions of Section 2, and Section 3 of Chapter III shall not apply to: (a) editorial platforms within the meaning of Article 2(ha) of this Regulation; (b) online platforms that qualify as micro and medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC; (c) an intermediary service, except very large online platforms, where it would constitute a disproportionate burden in view of its size, the nature of its activity and the risk posed to users.
2021/07/19
Committee: JURI
Amendment 537 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions, which have to respect European and national law. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/19
Committee: JURI
Amendment 545 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Where very large online platforms within the meaning of Article 25 of this Regulation otherwise allow for the dissemination to the public of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, such platforms shall not remove, disable access to, suspend or otherwise interfere with such content or the related service, or suspend or terminate the related account, on the basis of the alleged incompatibility of such content with its terms and conditions, unless that content is illegal content.
2021/07/19
Committee: JURI
Amendment 564 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised, where possible, by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;
2021/07/19
Committee: JURI
Amendment 566 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
2021/07/19
Committee: JURI
Amendment 570 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, where identifiable, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/19
Committee: JURI
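Taken together, points (a), (b) and (d) of Article 13(1), as amended above, outline the skeleton of the annual transparency report. A minimal sketch of such a record, with illustrative field names:

```python
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    """Content-moderation report along the lines of Article 13(1); all names are illustrative."""
    orders_by_content_type: dict = field(default_factory=dict)   # point (a), incl. Articles 8 and 9
    avg_order_handling_days: float = 0.0
    notices_by_content_type: dict = field(default_factory=dict)  # point (b), Article 14 notices
    actions_on_legal_basis: int = 0                              # action taken on the basis of the law...
    actions_on_terms_basis: int = 0                              # ...or of the provider's terms and conditions
    complaints_received: int = 0                                 # point (d), Article 17 complaint system
    decisions_reversed: int = 0

report = TransparencyReport(notices_by_content_type={"counterfeit": 120, "hate speech": 45})
print(report)
```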
Amendment 575 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a small or micro enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status.
2021/07/19
Committee: JURI
Amendment 586 #
2021/07/19
Committee: JURI
Amendment 614 #
Proposal for a regulation
Article 1 – paragraph 2 – point b – point i (new)
i) facilitate innovation, support the digital transition, encourage economic growth and create a level playing field for digital services within the internal market, while strengthening consumer protection and contributing to increased consumer choice.
2021/07/08
Committee: IMCO
Amendment 634 #
Proposal for a regulation
Article 15 – paragraph 4
4. Paragraphs 2, 3 and 4 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, those paragraphs shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status.
2021/07/19
Committee: JURI
Amendment 647 #
Proposal for a regulation
Article 16 a (new)
Article 16a
Notice and action mechanism
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: (a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content; (b) to the extent possible, a clear indication of the electronic location of that information and, where necessary, additional information enabling the identification of the illegal content; (c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; (d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are to the best of their knowledge accurate and complete.
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity.
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, within the timelines of Article 5(1a) and in a diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/19
Committee: JURI
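Paragraph 2 of the proposed Article 16a enumerates the elements a notice must carry, and paragraph 3 attaches the legal consequence (actual knowledge under Article 5) to complete notices. A sketch of that completeness check, assuming hypothetical field names and simplifying the "to the extent possible" qualifier of point (b):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    """Elements of Article 16a(2); the exception in point (c) makes contact details optional."""
    reasons: str                # (a) why the notifier considers the content illegal
    location: Optional[str]     # (b) electronic location, e.g. a URL, to the extent possible
    name: Optional[str]         # (c) name of the notifier
    email: Optional[str]        # (c) e-mail address of the notifier
    good_faith_statement: bool  # (d) accuracy and completeness declaration
    csam_related: bool = False  # offences under Articles 3 to 7 of Directive 2011/93/EU

def gives_rise_to_actual_knowledge(n: Notice) -> bool:
    """Article 16a(3): a notice containing all required elements triggers Article 5 knowledge."""
    contact_ok = n.csam_related or (n.name and n.email)
    return bool(n.reasons and n.good_faith_statement and contact_ok)

notice = Notice(reasons="counterfeit listing", location="https://example.com/item/1",
                name="Jane Doe", email="jane@example.com", good_faith_statement=True)
print(gives_rise_to_actual_knowledge(notice))  # True
```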
Amendment 659 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States; or (deleted)
2021/07/08
Committee: IMCO
Amendment 669 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;
2021/07/08
Committee: IMCO
Amendment 680 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. After internal complaint-handling mechanisms are exhausted, recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/19
Committee: JURI
Amendment 686 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/08
Committee: IMCO
Amendment 697 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. Infrastructure services such as web hosting or cloud service providers shall not be covered by the definition of online platforms;
2021/07/08
Committee: IMCO
Amendment 710 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are immediately processed, without prejudice to the implementation of a complaint and redress mechanism.
2021/07/19
Committee: JURI
Amendment 720 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that information;
2021/07/08
Committee: IMCO
Amendment 743 #
Proposal for a regulation
Article 19 – paragraph 7 a (new)
7a. Online platforms shall, where possible, provide trusted flaggers with access to technical means that help them detect illegal content on a large scale.
2021/07/19
Committee: JURI
Amendment 760 #
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
2021/07/19
Committee: JURI
Amendment 761 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts without undue delay to remove or to disable access to the illegal content.
2021/07/08
Committee: IMCO
Amendment 781 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with professional traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
2021/07/19
Committee: JURI
Amendment 782 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the bank account details of the trader, where the trader is a natural person; (deleted)
2021/07/19
Committee: JURI
Amendment 787 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary own-initiative measures aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation, without prejudice to freedom of expression.
2021/07/08
Committee: IMCO
Amendment 790 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as oversight, documentation and traceability, or additional measures to ensure that own-initiative investigations are accurate, legally justified and do not lead to over-removal of content.
2021/07/08
Committee: IMCO
Amendment 822 #
Proposal for a regulation
Article 24
Article 24 Online advertising transparency (deleted): Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: (a) that the information displayed is an advertisement; (b) the natural or legal person on whose behalf the advertisement is displayed; (c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed.
2021/07/19
Committee: JURI
Amendment 830 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(ba) the territorial scope of an order addressed to a provider that has its main establishment or, if the provider is not established in the Union, its legal representation in another Member State is limited to the territory of the Member State issuing the order;
2021/07/08
Committee: IMCO
Amendment 833 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(bb) if addressed to a provider that has its main establishment outside the Union, the territorial scope of the order, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
2021/07/08
Committee: IMCO
Amendment 857 #
Proposal for a regulation
Article 8 a (new)
Article 8a Injunction orders: Member States shall ensure that recipients of a service are entitled under their national law to seek an injunction order as an interim measure for removing manifestly illegal content.
2021/07/08
Committee: IMCO
Amendment 865 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively caused by an illegal activity;
2021/07/19
Committee: JURI
Amendment 870 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative and illegal effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/19
Committee: JURI
Amendment 873 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/07/19
Committee: JURI
Amendment 918 #
Proposal for a regulation
Article 11 – paragraph 4 a (new)
4a. Providers of intermediary services that would qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC if established in the Union, and who have been unsuccessful in designating a legal representative after reasonable efforts, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitate further cooperation and recommend possible solutions, including the possibility of collective representation.
2021/07/08
Committee: IMCO
Amendment 918 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any commitments of voluntary measures undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
2021/07/19
Committee: JURI
Amendment 925 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including information about algorithmic decision-making and human review. Providers of intermediary services shall also include information on the right to terminate the use of the service. The possibility to terminate must be easily accessible for the user. Information on remedies and redress mechanisms shall also be included in the terms and conditions. The terms and conditions shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/08
Committee: IMCO
Amendment 931 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall base the parameters of their recommender systems on Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (P2B) and set them out in their terms and conditions.
2021/07/19
Committee: JURI
Amendment 935 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters used in recommender systems shall always be fair and non-discriminatory.
2021/07/19
Committee: JURI
Amendment 938 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. (deleted)
2021/07/19
Committee: JURI
Amendment 946 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, for advertisements that have been seen by more than 5 000 recipients of the service and until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/19
Committee: JURI
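As amended, the repository obligation is scoped twice: only advertisements seen by more than 5 000 recipients, retained until six months after last display. A sketch of that retention filter under those two parameters; the structure is invented for illustration:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=182)  # roughly "six months after the advertisement was displayed for the last time"
MIN_REACH = 5_000                # amended reach threshold in Article 30(1)

def belongs_in_repository(reach: int, last_displayed: datetime, now: datetime) -> bool:
    """Scope of the public ad repository under the amended Article 30(1)."""
    return reach > MIN_REACH and now <= last_displayed + RETENTION

print(belongs_in_repository(12_000, datetime(2021, 7, 1), datetime(2021, 9, 1)))  # True
print(belongs_in_repository(800, datetime(2021, 7, 1), datetime(2021, 9, 1)))     # False: below reach threshold
```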
Amendment 950 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Obligations pursuant to paragraphs 1 and 2 should not oblige a provider of an intermediary service to disclose information that would lead to significant vulnerabilities for the security of its service or for the protection of confidential information, in particular trade secrets or intellectual property rights.
2021/07/08
Committee: IMCO
Amendment 953 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; (deleted)
2021/07/19
Committee: JURI
Amendment 954 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically. (deleted)
2021/07/19
Committee: JURI
Amendment 966 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period specified in the request, information and access to data that are necessary to properly monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
2021/07/19
Committee: JURI
Amendment 969 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period as specified in the request, provide information and access to relevant data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
2021/07/19
Committee: JURI
Amendment 973 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 for a limited time and through online databases or application programming interfaces, as appropriate.
2021/07/19
Committee: JURI
Amendment 975 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period, including information on the following, as applicable:
2021/07/08
Committee: IMCO
Amendment 989 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
2021/07/08
Committee: IMCO
Amendment 1002 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a medium-sized, small or micro-enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
2021/07/08
Committee: IMCO
Amendment 1009 #
Proposal for a regulation
Article 34 – paragraph 1 – point f
(f) transmission of data between advertising intermediaries in support of transparency obligations pursuant to points (b) and (c) of Article 24. (deleted)
2021/07/19
Committee: JURI
Amendment 1016 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) in relation to the dissemination of illegal content emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/19
Committee: JURI
Amendment 1019 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/19
Committee: JURI
Amendment 1024 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives in relation to the dissemination of illegal content, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of the relevant stakeholders, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
2021/07/19
Committee: JURI
Amendment 1032 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Article 30 and Article 6 of Directive 2000/31/EC.
2021/07/19
Committee: JURI
Amendment 1034 #
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30.
2021/07/19
Committee: JURI
Amendment 1035 #
Proposal for a regulation
Article 36 – paragraph 2 – point a
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24; (deleted)
2021/07/19
Committee: JURI
Amendment 1036 #
Proposal for a regulation
Article 36 – paragraph 2 – point b
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30. (deleted)
2021/07/19
Committee: JURI
Amendment 1044 #
2021/07/19
Committee: JURI
Amendment 1058 #
Proposal for a regulation
Article 41 – paragraph 2 – point e
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm.
2021/07/19
Committee: JURI
Amendment 1060 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2, on the basis of which a diligent provider of hosting services is able to assess the illegality of the content in question, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
2021/07/08
Committee: IMCO
Amendment 1060 #
Proposal for a regulation
Article 42 – paragraph 1
1. Member States shall lay down the rules on penalties, including administrative fines, applicable to infringements of this Regulation by providers of intermediary services under their jurisdiction and shall take all the necessary measures to ensure that they are properly and effectively implemented in accordance with Article 41.
2021/07/19
Committee: JURI
Amendment 1061 #
Proposal for a regulation
Article 42 – paragraph 2
2. Penalties shall be effective, proportionate and dissuasive. They shall take into particular account the interests of small-scale providers and start-ups and their economic viability. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them.
2021/07/19
Committee: JURI
Amendment 1064 #
Proposal for a regulation
Article 14 – paragraph 4
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
2021/07/08
Committee: IMCO
Amendment 1081 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services could, as a voluntary measure in line with the provisions of Article 6, conduct own-initiative investigation measures to prevent illegal content which has previously been identified as illegal from being disseminated again once removed. The obligations related to paragraphs 1 to 6 shall by no means impose general monitoring obligations on hosting services.
2021/07/08
Committee: IMCO
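One common way to implement the voluntary stay-down measure described in this amendment, without general monitoring, is to fingerprint items already found illegal and compare new uploads only against that list. A minimal sketch under that assumption (exact-match hashing; a production system would more likely use perceptual hashing):

```python
import hashlib

removed_hashes: set[str] = set()  # fingerprints of items previously identified as illegal

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def on_removal(content: bytes) -> None:
    """Record an item removed as illegal, enabling the voluntary stay-down check."""
    removed_hashes.add(fingerprint(content))

def is_known_illegal_reupload(content: bytes) -> bool:
    """Exact-match check against past removals only; other content is not monitored."""
    return fingerprint(content) in removed_hashes

on_removal(b"example illegal item")
print(is_known_illegal_reupload(b"example illegal item"))  # True
print(is_known_illegal_reupload(b"unrelated upload"))      # False
```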
Amendment 1087 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6b. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, or to those enterprises within twelve months of them losing such status pursuant to Article 4(2) thereof.
2021/07/08
Committee: IMCO
Amendment 1089 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6c. Paragraphs 2, 4 and 5 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service, or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 1103 #
Proposal for a regulation
Article 47 – paragraph 2 – point a a (new)
(aa) contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market, and of the obligations of very large platforms under Article 5 of the Platform-to-Business Regulation (EU) 2019/1150;
2021/07/19
Committee: JURI
Amendment 1120 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall, upon request, share the decisions and the statements of reasons referred to in paragraph 1 with the Digital Services Coordinator of establishment. That information shall not contain personal data.
2021/07/08
Committee: IMCO
Amendment 1122 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4a. Paragraphs 2 to 4 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, or during the first twelve months after an enterprise has lost such status pursuant to Article 4(2) thereof.
2021/07/08
Committee: IMCO
Amendment 1132 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
2021/07/19
Committee: JURI
Amendment 1135 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation of and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide explanations relating to and, where necessary, access to its databases and algorithms.
2021/07/19
Committee: JURI
Amendment 1138 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The Commission may, by decision and in compliance with the proportionality principle, impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently:
2021/07/19
Committee: JURI
Amendment 1151 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - six months after its entry into force].
2021/07/19
Committee: JURI
Amendment 1157 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove or not to remove or disable access to the information;
2021/07/08
Committee: IMCO
Amendment 1158 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions to suspend or terminate or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
2021/07/08
Committee: IMCO
Amendment 1161 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions to suspend or terminate or not to suspend or terminate the recipients’ account.
2021/07/08
Committee: IMCO
Amendment 1166 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to radically restrict the visibility of content provided by the recipients;
2021/07/08
Committee: IMCO
Amendment 1203 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) and individuals or entities that have submitted notices shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/08
Committee: IMCO
Amendment 1213 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is independent, including financially independent, and impartial of online platforms and recipients of the service provided by the online platforms and of individuals or entities that have submitted notices;
2021/07/08
Committee: IMCO
Amendment 1221 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online;
2021/07/08
Committee: IMCO
Amendment 1236 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure that are clearly visible and easily accessible to all parties concerned and in full compliance with all applicable law.
2021/07/08
Committee: IMCO
Amendment 1242 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. The Digital Services Coordinator shall reassess on a yearly basis whether the certified out-of-court dispute settlement body continues to fulfil the listed criteria. If this is not the case, the Digital Services Coordinator shall revoke the status from the out-of-court dispute settlement body.
2021/07/08
Committee: IMCO
Amendment 1251 #
Proposal for a regulation
Article 18 – paragraph 5
5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph as well as out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated.
2021/07/08
Committee: IMCO
Amendment 1262 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by certified trusted flaggers, within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay, depending on the severity of the illegal activity.
2021/07/08
Committee: IMCO
Amendment 1278 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform;
2021/07/08
Committee: IMCO
Amendment 1296 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2 or whose status has been revoked in accordance with paragraph 6.
2021/07/08
Committee: IMCO
Amendment 1308 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, carried out without undue delay, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/07/08
Committee: IMCO
Amendment 1339 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
2021/07/08
Committee: IMCO
Amendment 1349 #
Proposal for a regulation
Article 20 – paragraph 4 a (new)
4a. Providers of hosting services could, as a voluntary measure in line with the provisions of Article 6, conduct own-initiative investigation measures to prevent suspended accounts from reappearing before the suspension is lifted. The obligations related to paragraphs 1 to 4 shall by no means impose general monitoring obligations on hosting services.
2021/07/08
Committee: IMCO
Amendment 1381 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the bank account details of the trader, where the trader is a natural person; (deleted)
2021/07/08
Committee: IMCO
Amendment 1391 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and, where applicable, confirming that all products have been checked against the Union Rapid Alert System for dangerous non-food products (Rapex).
2021/07/08
Committee: IMCO
Amendment 1404 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d), (e) and (f) of paragraph 1 is reliable through the use of any freely accessible official online database, like the Rapex system, or online interfaces made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. The online platform shall require that traders promptly inform them of any changes to the information referred to in points (a), (d), (e) and (f) and regularly repeat this verification process.
2021/07/08
Committee: IMCO
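The amended paragraph describes a two-step loop: assess the trader's declared data against freely accessible official databases, then repeat the verification when the data changes. A schematic sketch; the lookup function is a pure placeholder for services such as the Rapex interface, and the field names are invented:

```python
REQUIRED_CHECKS = ("name_and_contact", "economic_operator", "trade_register_entry", "self_certification")

def official_database_lookup(field: str, value: str) -> bool:
    """Placeholder for a query against a freely accessible official online database or interface."""
    return bool(value)  # toy stand-in: a real check would query the relevant register

def trader_information_reliable(trader: dict) -> bool:
    """Reasonable-efforts assessment of the Article 22(1) items named in paragraph 2."""
    return all(official_database_lookup(f, trader.get(f, "")) for f in REQUIRED_CHECKS)

trader = {f: "verified" for f in REQUIRED_CHECKS}
print(trader_information_reliable(trader))  # True; re-run whenever the trader updates its data
```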
Amendment 1413 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the online platform obtains indications that information under point (f) of paragraph 1 is inaccurate, it shall remove the product or service directly from its online platform, and if any other item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/07/08
Committee: IMCO
Amendment 1470 #
Proposal for a regulation
Article 23 – paragraph 1 – point c
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied.
2021/07/08
Committee: IMCO
Amendment 1473 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall communicate to the Digital Services Coordinator of establishment, at least once every twelve months, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past twelve months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2021/07/08
Committee: IMCO
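The amended paragraph changes the cadence (yearly) and the scope (Union-wide) of the figure, but the calculation itself remains a trailing average. A sketch assuming a simple list of monthly counts:

```python
from statistics import mean

def average_monthly_active_recipients(monthly_counts: list[int]) -> float:
    """Average over the past twelve months, as in the amended Article 23(2)."""
    if len(monthly_counts) < 12:
        raise ValueError("need at least twelve months of data")
    return mean(monthly_counts[-12:])

counts = [30_000_000 + 500_000 * i for i in range(14)]  # toy Union-wide monthly actives
print(f"{average_monthly_active_recipients(counts):,.0f}")
```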
Amendment 1476 #
Proposal for a regulation
Article 23 – paragraph 2 a (new)
2a. Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in the context of exercising their supervisory powers.
2021/07/08
Committee: IMCO
Amendment 1510 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
2. Online platforms shall provide the information mentioned in paragraph 1 to public authorities, upon their request, in order to determine accountability in the case of false or misleading advertisements.
2021/07/08
Committee: IMCO
Amendment 1552 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the dissemination of illegal content on their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1567 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively through dissemination of illegal content;
2021/07/08
Committee: IMCO
Amendment 1577 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative and illegal effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/08
Committee: IMCO
Amendment 1587 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1596 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. The obligations detailed in paragraphs 1 and 2 shall by no means lead to a general monitoring obligation.
2021/07/08
Committee: IMCO
Amendment 1604 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures targeting illegal practices, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1661 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any voluntary commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
2021/07/08
Committee: IMCO
Amendment 1665 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
2021/07/08
Committee: IMCO
Amendment 1702 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Obligations pursuant to paragraphs 1 and 2 shall not oblige a very large online platform to disclose information that would lead to significant vulnerabilities for the security of its service or for the protection of confidential information, in particular trade secrets and intellectual property rights. Further, very large online platforms shall not be required to enable modification of systems essential to uphold the safety and security of the service.
2021/07/08
Committee: IMCO
Amendment 1719 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
Amendment 1767 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, disclose the funding of the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/08
Committee: IMCO
Amendment 1783 #
Proposal for a regulation
Article 31 – paragraph 7
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
deleted
2021/07/08
Committee: IMCO
Amendment 1798 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every sixtwelve months.
2021/07/08
Committee: IMCO
Amendment 1846 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. The Commission shall also encourage and facilitate regular review and adaptation of the codes of conduct to ensure that they are fit for purpose.
2021/07/08
Committee: IMCO
Amendment 1853 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested partiesrelevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/08
Committee: IMCO
Amendment 1864 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments should take into account differences in size and capacity between different participants.
2021/07/08
Committee: IMCO
Amendment 1883 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
2021/07/08
Committee: IMCO
Amendment 1897 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
2021/07/08
Committee: IMCO
Amendment 1945 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point e
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm, without prejudice to fundamental rights.
2021/07/08
Committee: IMCO
Amendment 1978 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
(ba) the conditions met to justify any order to act against illegal content or to provide information taken in derogation from the internal market clause in accordance with Article 3 of Directive 2000/31/EC.
2021/07/08
Committee: IMCO
Amendment 2039 #
Proposal for a regulation
Article 47 – paragraph 2 – point a a (new)
(aa) contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market;
2021/07/08
Committee: IMCO
Amendment 2088 #
Proposal for a regulation
Article 49 – paragraph 1 – point d a (new)
(da) monitor derogations from the internal market clause in accordance with Article 3 of Directive 2000/31/EC and ensure that the conditions for derogation are interpreted strictly and narrowly, so as to guarantee the consistent application of this Regulation;
2021/07/08
Committee: IMCO
Amendment 2164 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement, without prejudice to fundamental rights.
2021/07/08
Committee: IMCO
Amendment 2182 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, and where necessary access to, its databases and algorithms.
2021/07/08
Committee: IMCO
Amendment 2212 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The Commission may by decision and in compliance with the proportionality principle impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligentlyas a result of repeated negligence:
2021/07/08
Committee: IMCO
Amendment 2296 #
Proposal for a regulation
Article 74 – paragraph 2
2. It shall apply from [date - threetwelve months after its entry into force].
2021/07/08
Committee: IMCO