
336 Amendments of Liesje SCHREINEMACHER related to 2020/0361(COD)

Amendment 100 #
Proposal for a regulation
Recital 4 a (new)
(4 a) As Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the provisions of the Convention are an integral part of the Union legal order and binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity.” Given the ever-growing importance of digital services and platforms in private and public life, in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects the rights of all recipients of services, including persons with disabilities.
2021/07/20
Committee: JURI
Amendment 101 #
Proposal for a regulation
Recital 5 a (new)
(5 a) Given the cross-border nature of the services at stake, EU action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right to access and choice of those services by all consumers and other recipients of services, including persons with disabilities, is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
2021/07/20
Committee: JURI
Amendment 102 #
Proposal for a regulation
Recital 5 b (new)
(5 b) The notions of ‘access’ or ‘accessibility’ are often used to mean affordability (financial access) or availability, or are used in relation to access to data, use of networks, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’, which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
2021/07/20
Committee: JURI
Amendment 105 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, among others, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. To assist Member States and providers, the Commission should provide guidelines as to how to interpret the interaction between different Union acts and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements.
_________________
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation.
2021/07/20
Committee: JURI
Amendment 126 #
Proposal for a regulation
Recital 12 a (new)
(12 a) Material disseminated for educational, journalistic, artistic or research purposes, or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, should not be considered as illegal content. Similarly, material such as an eye-witness video of a potential crime should not be considered illegal merely because it depicts an illegal act. An assessment should determine the true purpose of that dissemination and whether the material is disseminated to the public for those purposes.
2021/07/20
Committee: JURI
Amendment 132 #
Proposal for a regulation
Recital 13 a (new)
(13 a) Additionally, in order to avoid imposing obligations simultaneously on two providers for the same content, a hosting service should be defined as an online platform when it has a direct relationship with the recipient of the service. A hosting provider which is acting as the infrastructure for an online platform should not be considered as an online platform based on this relationship, where it implements the decisions of the online platform and its user indirectly.
2021/07/20
Committee: JURI
Amendment 165 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case, where set down in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implements European acts, in accordance with the conditions established in this Regulation and other European lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, nothing in this Regulation should prevent providers from implementing end-to-end encryption of their services.
2021/07/20
Committee: JURI
Amendment 174 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.
2021/07/20
Committee: JURI
Amendment 182 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders to provide information in question relate to specific items of illegal content and information as defined in Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. Article 3 of Directive 2000/31/EC, however, continues to apply to any other orders related to non-specific individual items of illegal or legal content or information, general orders related to geoblocking of whole websites, webpages, or domains and any other matter which could be seen as restricting the freedom to provide their service across borders.
2021/07/20
Committee: JURI
Amendment 191 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. At the same time, recipients should enter into such agreements willingly without any misleading or coercive tactics and therefore a ban on dark patterns should be introduced.
2021/07/20
Committee: JURI
Amendment 193 #
Proposal for a regulation
Recital 38 a (new)
(38 a) While an additional requirement should apply to very large online platforms, all providers should carry out a general self-assessment of potential risks related to their services, especially in relation to minors, and should take voluntary mitigation measures where appropriate. In order to ensure that the provider undertakes these actions, Digital Services Coordinators may ask for proof.
2021/07/20
Committee: JURI
Amendment 194 #
Proposal for a regulation
Recital 4 a (new)
(4a) As Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the provisions of the Convention are an integral part of the Union legal order and binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity.” Given the ever-growing importance of digital services and platforms in private and public life, in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects the rights of all recipients of services, including persons with disabilities.
2021/07/08
Committee: IMCO
Amendment 199 #
Proposal for a regulation
Recital 5 a (new)
(5a) Given the cross-border nature of the services at stake, Union action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right to access and choice of those services by all consumers and other recipients of services, including persons with disabilities, is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
2021/07/08
Committee: IMCO
Amendment 201 #
Proposal for a regulation
Recital 5 b (new)
(5b) The notions of ‘access’ or ‘accessibility’ are often used to mean affordability (financial access) or availability, or are used in relation to access to data, use of networks, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’, which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
2021/07/08
Committee: IMCO
Amendment 202 #
Proposal for a regulation
Recital 40 a (new)
(40 a) Notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content, such as to an online platform and not to the hosting service provider which provides services to that online platform. Such hosting service providers should redirect such notices to the particular online platform and inform the notifying party of this fact.
2021/07/20
Committee: JURI
Amendment 203 #
Proposal for a regulation
Recital 40 b (new)
(40 b) Hosting providers should seek to act only against the items of information notified. This may include acts such as disabling hyperlinking to the items of information. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal, contractual or technological reasons, such as with encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. If a recipient fails to act or delays action, or the provider has reason to believe that the recipient has failed to act or otherwise acts in bad faith, the hosting provider may suspend its service to that recipient in line with its terms and conditions.
2021/07/20
Committee: JURI
Amendment 204 #
Proposal for a regulation
Recital 7
(7) In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide and direct services at and in the Union, as evidenced by a substantial connection to the Union.
2021/07/08
Committee: IMCO
Amendment 206 #
Proposal for a regulation
Recital 41 a (new)
(41 a) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, either because it is illegal or because it is not allowed under its terms and conditions, it should do so in a timely manner, taking into account the potential harm of the infraction and the technical abilities of the provider. Information that could have a negative effect on minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, should be treated as a matter requiring urgency.
2021/07/20
Committee: JURI
Amendment 208 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress. Such a statement, however, should not be required if it relates to spam, manifestly illegal content, or the removal of content similar or identical to content already removed from the same recipient, who has already received a statement, or where a provider of a hosting service does not have the information necessary to inform the recipient on a durable medium.
2021/07/20
Committee: JURI
Amendment 214 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, among others, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. To assist Member States and providers, the Commission should provide guidelines as to how to interpret the interaction between different Union acts and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements.
__________________
28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1.
29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation.
2021/07/08
Committee: IMCO
Amendment 219 #
Proposal for a regulation
Recital 10
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council,31 Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35, Directive (EU) 2019/882 of the European Parliament and of the Council, and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council.38 The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions.
__________________
30 Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1).
31 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
32 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37.
33 Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC.
34 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’).
35 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council.
36 Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts.
37 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules.
38 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
2021/07/08
Committee: IMCO
Amendment 220 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of entering, in good faith, into an out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies located in either the Member State of the recipient or of the provider and that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/20
Committee: JURI
Amendment 225 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should normally only be awarded to non-governmental entities, and not natural persons, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities, however, can be public in nature for actions not related to intellectual property rights, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, non-governmental organisations of industry and of right-holders could also be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43
_________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
2021/07/20
Committee: JURI
Amendment 231 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content, and notices or complaints should be considered manifestly unfounded, where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/20
Committee: JURI
Amendment 232 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of persons, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing upon request all relevant information available to it, including where relevant the content in question and an explanation of its suspicion, and, unless instructed otherwise, should remove or disable the content. Information obtained by a law enforcement or judicial authority of a Member State in accordance with this Article should not be used for any purpose other than those directly related to the individual serious criminal offence notified. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
_________________
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/20
Committee: JURI
Amendment 242 #
Proposal for a regulation
Recital 13 a (new)
(13a) Additionally, in order to avoid imposing obligations simultaneously on two providers for the same content, a hosting service should only be deemed an online platform when it has a direct relationship with the recipient of the service. A hosting provider which is acting as the infrastructure for an online platform should not be considered as an online platform based on this relationship, where it implements the decisions of the online platform and its user indirectly.
2021/07/08
Committee: IMCO
Amendment 243 #
Proposal for a regulation
Recital 13 b (new)
(13b) For the purpose of this Regulation, a cloud computing service should not be considered as an ‘online platform’ where allowing the dissemination of hyperlinks to specific content is a minor and ancillary feature. Moreover, a cloud computing service, when serving as infrastructure, for example as the underlying infrastructural storage and computing services of an internet-based application or online platform, should not in itself be seen as disseminating to the public information stored or processed at the request of a recipient of an application or online platform which it hosts.
2021/07/08
Committee: IMCO
Amendment 247 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/20
Committee: JURI
Amendment 248 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, falling within the scope of this Regulation should not be seen as disseminating to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Where multiple providers are involved in the dissemination of information to the public, the obligations related to that dissemination should lie with the outward-facing provider closest, in terms of accessibility, to the end user recipient of the final service.
__________________
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
2021/07/08
Committee: IMCO
Amendment 258 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the misuse of the platforms' terms and conditions, including content moderation policies, when enforced, often through automatic means. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/20
Committee: JURI
Amendment 260 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, prevent the manipulation and exploitation of the service, including by the amplification of content which is counter to their terms and conditions, adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, including by displaying related public service advertisements instead of other commercial advertisements. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 262 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. The mere ranking or displaying in an order, or the use of a recommender system, should not, however, be deemed as having control over that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
2021/07/08
Committee: IMCO
Amendment 269 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. It also should not be understood to cover the ranking or sorting of information to make it accessible to a user or actions required to ensure the security of the transmissions.
2021/07/08
Committee: IMCO
Amendment 273 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. As long as providers act upon obtaining actual knowledge, providers should maintain the exemptions from liability referred to in Articles 3, 4 and 5, even when undertaking voluntary own-initiative investigations or actions in line with Article 27.
2021/07/08
Committee: IMCO
Amendment 280 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
2021/07/19
Committee: JURI
Amendment 283 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers which meet the conditions set out in this Regulation. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/19
Committee: JURI
Amendment 285 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely online marketplaces, which are online platforms that allow consumers to conclude distance contracts with traders on the online platform itself, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online marketplaces present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. This may include the storage, packing and shipment of a good from a warehouse under the control of the online marketplace. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/07/08
Committee: IMCO
Amendment 309 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be and among others, wireless local area networks, domain name system (DNS) services, top–level domain name registries, certificate authorities that issue digital certificates, Virtual Private Networks, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting service.
2021/07/08
Committee: IMCO
Amendment 311 #
Proposal for a regulation
Recital 27 a (new)
(27a) A single webpage or website may include elements that qualify differently between ‘mere conduit’, ‘caching’ or hosting services and the rules for exemptions from liability should apply to each accordingly. For example, a search engine may act solely as a ‘caching’ service as to information included in the results of an inquiry. Elements displayed alongside those results, such as online advertisements, would however still meet the standard of a hosting service.
2021/07/08
Committee: IMCO
Amendment 312 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case, where set down in Union acts and, in particular, does not affect orders by national authorities in accordance with national legislation that implements Union acts, in accordance with the conditions established in this Regulation and other Union law regarded as lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, nothing in this Regulation should prevent providers from implementing end-to-end encryption of their services.
2021/07/08
Committee: IMCO
Amendment 314 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV and Article 8 and 9 by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.
2021/07/19
Committee: JURI
Amendment 317 #
Proposal for a regulation
Recital 78
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure coherence between the Member States, the Commission should adopt guidance on the procedures and rules related to the powers of Digital Services Coordinators.
2021/07/19
Committee: JURI
Amendment 322 #
Proposal for a regulation
Recital 88
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level and with legal personality, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to the Member State’s Digital Services Coordinator.
2021/07/19
Committee: JURI
Amendment 324 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.
2021/07/08
Committee: IMCO
Amendment 336 #
Proposal for a regulation
Recital 97 a (new)
(97 a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.
2021/07/19
Committee: JURI
Amendment 338 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information as defined in Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. Article 3 of Directive 2000/31/EC, however, continues to apply to any other orders related to non-specific individual items of illegal or legal content or information, general orders related to geoblocking of whole websites, webpages, or domains and any other matter which could be seen as restricting the freedom to provide their service across borders.
2021/07/08
Committee: IMCO
Amendment 341 #
Proposal for a regulation
Recital 99
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
2021/07/19
Committee: JURI
Amendment 344 #
Proposal for a regulation
Recital 104
(104) In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is equally important that, where standardisation bodies are unable to agree on the standards needed to implement this Regulation fully, the Commission is able to adopt delegated acts. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.
2021/07/19
Committee: JURI
Amendment 348 #
Proposal for a regulation
Article 1 – paragraph 1
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market whilst ensuring the rights enshrined in the Charter of Fundamental Rights of the European Union, in particular the freedom of expression and information in an open and democratic society. In particular, it establishes:
2021/07/19
Committee: JURI
Amendment 351 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation in relation to those services. Services that do not fall within those different categories should not be affected, even when provided by the same provider or under the same ownership structure. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
2021/07/08
Committee: IMCO
Amendment 351 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/19
Committee: JURI
Amendment 354 #
Proposal for a regulation
Recital 35 a (new)
(35a) Similarly, in order to ensure that the obligations are only applied to those providers of intermediary services where the benefit would outweigh the burden on the provider, the Commission should be empowered to issue a waiver from the requirements of Chapter III, in whole or in part, to those providers of intermediary services that are not-for-profit or equivalent and serve a manifestly positive role in the public interest, or that are SMEs without any systemic risk related to illegal content. The providers should present justified reasons for why they should be issued a waiver. The Commission should examine such an application and should have the authority to issue or revoke a waiver at any time. The Commission should maintain a public list of all waivers issued and their conditions, containing a description of why the waiver is justified for the provider.
2021/07/08
Committee: IMCO
Amendment 355 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. This contact point may be the same contact point as required under other Union acts. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/07/08
Committee: IMCO
Amendment 355 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4 a. This Regulation shall respect the fundamental rights recognised by the Charter of Fundamental Rights of the European Union and the fundamental rights constituting general principles of Union law. Accordingly, this Regulation may only be interpreted and applied in accordance with those fundamental rights, including the freedom of expression and information, as well as the freedom and pluralism of the media. When exercising the powers set out in this Regulation, all public authorities involved shall aim to achieve, in situations where the relevant fundamental rights conflict, a fair balance between the rights concerned, in accordance with the principle of proportionality.
2021/07/19
Committee: JURI
Amendment 358 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
(b) Directive 2010/13/EU as amended by Directive (EU) 2018/1808;
2021/07/19
Committee: JURI
Amendment 360 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
2021/07/19
Committee: JURI
Amendment 363 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. At the same time, recipients should enter into such agreements willingly, without any misleading or coercive tactics, and a ban on dark patterns should therefore be introduced.
2021/07/08
Committee: IMCO
Amendment 363 #
Proposal for a regulation
Article 1 – paragraph 5 – point h
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Regulation XXX (General Product Safety Regulation);
2021/07/19
Committee: JURI
Amendment 364 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Directive (EU) 2019/882;
2021/07/19
Committee: JURI
Amendment 366 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5 a. The Commission shall, [within one year of the adoption of this Regulation], publish guidelines with regard to the relations between this Regulation and the legislative acts listed in Article 1(5). These guidelines shall clarify any potential conflicts between the conditions and obligations listed in those legislative acts, specify which act prevails where actions taken in line with this Regulation fulfil the obligations of another legislative act, and identify which regulatory authority is competent.
2021/07/19
Committee: JURI
Amendment 368 #
Proposal for a regulation
Recital 38 a (new)
(38a) While additional requirements should apply to very large online platforms, all providers should carry out a general self-assessment of potential risks related to their services, especially in relation to minors, and should take voluntary mitigation measures where appropriate. In order to ensure that the provider undertakes these actions, Digital Services Coordinators may ask for proof.
2021/07/08
Committee: IMCO
Amendment 369 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40, which do not also qualify as very large online platforms. In any public versions of such reports, providers of intermediary services should remove any information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions. __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/08
Committee: IMCO
Amendment 382 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law, or which, due to its connection to or promotion of an illegal activity, including the sale of products, substances, animals or plants, or the provision of services, directly leads to the dissemination to the public of such illegal content. Material disseminated for educational, journalistic, artistic or research purposes or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, shall not be considered as illegal content. An assessment shall determine the true purpose of that dissemination and whether material is disseminated to the public for those purposes.
2021/07/19
Committee: JURI
Amendment 383 #
Proposal for a regulation
Recital 40 a (new)
(40a) Nevertheless, notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content, such as an online platform and not the hosting service provider which provides services to that online platform. Such hosting service providers should redirect such notices to the particular online platform and inform the notifying party of this fact.
2021/07/08
Committee: IMCO
Amendment 384 #
Proposal for a regulation
Recital 40 b (new)
(40b) Moreover, hosting providers should seek to act only against the items of information notified. This may include acts such as disabling hyperlinking to the items of information. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal, contractual or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. If a recipient fails to act or delays action, or the provider has reason to believe that the recipient has failed to act or is otherwise acting in bad faith, the hosting provider may suspend its service in line with its terms and conditions.
2021/07/08
Committee: IMCO
Amendment 387 #
Proposal for a regulation
Recital 41 a (new)
(41a) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, either because it is illegal or because it is not allowed under its terms and conditions, it should do so in a timely manner, taking into account the potential harm of the infraction and the technical abilities of the provider. Information that could have a negative effect on minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, should be treated as a matter requiring urgency.
2021/07/08
Committee: IMCO
Amendment 388 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and reliable, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress. Such a statement, however, should not be required if it relates to spam, manifestly illegal content, or the removal of content similar or identical to content already removed from the same recipient who has already received a statement, or where a provider of hosting services does not have the information necessary to inform the recipient on a durable medium.
2021/07/08
Committee: IMCO
Amendment 398 #
Proposal for a regulation
Recital 42 a (new)
(42a) Due to the international nature of many providers of hosting services, many have already implemented similar requirements under the laws of third countries. In order to prevent a duplication of requirements and the removal of existing systems for recipients, the Commission should be empowered to declare these mechanisms as ensuring an adequate level of protection and fulfilling the requirements of Articles 14 and 15.
2021/07/08
Committee: IMCO
Amendment 404 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of entering, in good faith, into out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies located in either the Member State of the recipient or that of the provider and that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/08
Committee: IMCO
Amendment 406 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘dark pattern’ means a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision- making or choice.
2021/07/19
Committee: JURI
Amendment 408 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(q b) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
2021/07/19
Committee: JURI
Amendment 409 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(q b) 'minor' means a child below the age of 16, as established in Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 410 #
Proposal for a regulation
Article 2 – paragraph 1 – point q c (new)
(q c) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882;
2021/07/19
Committee: JURI
Amendment 412 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should normally only be awarded to non-governmental entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can, however, be public in nature for actions not related to intellectual property rights, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, non-governmental organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/08
Committee: IMCO
Amendment 422 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/08
Committee: IMCO
Amendment 426 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of a person, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing upon request all relevant information available to it, including where relevant the content in question and an explanation of its suspicion and, unless instructed otherwise, should remove or disable the content. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/08
Committee: IMCO
Amendment 435 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online marketplaces should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary and no longer than six months after the end of a relationship with the trader, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate direct interest, including through the orders to provide information referred to in this Regulation.
2021/07/08
Committee: IMCO
Amendment 440 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. This Regulation shall not prevent providers from offering end- to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability.
2021/07/19
Committee: JURI
Amendment 447 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
1 a. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that has issued the order.
2021/07/19
Committee: JURI
Amendment 448 #
Proposal for a regulation
Recital 50 a (new)
(50a) The online interface of an online marketplace should allow traders to provide the information referred to in Article 22a of this Regulation and any other information needed and necessary to allow for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. Providers of online marketplaces, when they become aware that a product or service is illegal, should inform recipients who have acquired the product or service through their marketplace of this fact and of any possible redress.
2021/07/08
Committee: IMCO
Amendment 453 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
- the identification of the issuing authority and the means to verify the authenticity of the order;
2021/07/19
Committee: JURI
Amendment 456 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisements that are themselves illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/08
Committee: IMCO
Amendment 459 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective and in any case does not exceed the territory of the Member State issuing the order;
2021/07/19
Committee: JURI
Amendment 464 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by the provider in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider.
2021/07/19
Committee: JURI
Amendment 467 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(c a) the order is issued only where no other effective means are available to bring about the cessation or the prohibition of the infringement;
2021/07/19
Committee: JURI
Amendment 468 #
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
(c b) where more than one provider of intermediary services is responsible for hosting the specific item, the order is issued to the most appropriate provider that has the technical and operational ability to act against the specific item.
2021/07/19
Committee: JURI
Amendment 469 #
Proposal for a regulation
Article 8 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders.
2021/07/19
Committee: JURI
Amendment 470 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The determination of this operational threshold should therefore take into account only those recipients which are natural persons residing in the Union or natural persons acting on behalf of a legal person established in the Union. Automated bots, fake accounts, indirect hyperlinking, FTP or other indirect downloading of content should not be included in the determination of whether this threshold is exceeded. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
2021/07/08
Committee: IMCO
Amendment 470 #
Proposal for a regulation
Article 8 – paragraph 2 b (new)
2 b. Member States shall ensure that providers have a right to appeal and to object to implementing the order and shall facilitate the use of and access to that right.
2021/07/19
Committee: JURI
Amendment 471 #
Proposal for a regulation
Article 8 – paragraph 2 c (new)
2 c. When an order to act against a specific individual item of illegal content is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator of the Member State of that judicial or administrative authority.
2021/07/19
Committee: JURI
Amendment 474 #
Proposal for a regulation
Article 8 – paragraph 3 – subparagraph 1 (new)
Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union law or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, on the basis of a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order is to be enforced.
2021/07/19
Committee: JURI
Amendment 477 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the misuse of the way platforms’ terms and conditions, including content moderation policies, are enforced, often through automated means. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 478 #
Proposal for a regulation
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law, including the Charter of Fundamental Rights. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives pursued therein.
2021/07/19
Committee: JURI
Amendment 482 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and prevent the manipulation and exploitation of the service, including by the amplification of content which is counter to their terms and conditions, adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, including by displaying related public service advertisements instead of other commercial advertisements. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 488 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/07/19
Committee: JURI
Amendment 489 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
1 a. If the provider cannot comply with the information order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that issued the information order.
2021/07/19
Committee: JURI
Amendment 495 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
-1 the identification of the issuing authority and the means to verify the authenticity of the order;
2021/07/19
Committee: JURI
Amendment 499 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
2021/07/08
Committee: IMCO
Amendment 504 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms for vetted researchers who meet the conditions set out in this Regulation. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 505 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider;
2021/07/19
Committee: JURI
Amendment 506 #
Proposal for a regulation
Article 9 – paragraph 2 – point c a (new)
(c a) the order is issued only where no other effective means are available to receive the same specific item of information;
2021/07/19
Committee: JURI
Amendment 507 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. It shall ensure that the form meets the standards set down in the Annex of [XXX the regulation on European Production and Preservation Orders for electronic evidence in criminal matters].
2021/07/19
Committee: JURI
Amendment 508 #
Proposal for a regulation
Article 9 – paragraph 2 b (new)
2 b. When an order to provide a specific item of information about one or more specific individual recipients of the service is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator of the Member State of that judicial or administrative authority.
2021/07/19
Committee: JURI
Amendment 511 #
Proposal for a regulation
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. However, where no voluntary industry standard is agreed and the Commission finds that the application of this Regulation by providers is significantly divergent, the Commission should be empowered to adopt delegated acts where needed until a voluntary industry standard is agreed.
2021/07/08
Committee: IMCO
Amendment 512 #
Proposal for a regulation
Article 9 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law.
2021/07/19
Committee: JURI
Amendment 519 #
Proposal for a regulation
Article -10 (new)
Article -10 Waiver 1. Providers of intermediary services may apply to the Commission for a waiver from the requirements of Chapter III, if they prove that they are: (a) not-for-profit or equivalent and serve a manifestly positive role in the public interest; (b) micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC; or (c) medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC without any systemic risk related to illegal content. The providers shall present justified reasons for their request. 2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver, in whole or in part, from the requirements of this Chapter. 3. Upon the request of the Board or the provider, or on its own initiative, the Commission may review a waiver issued and revoke the waiver in whole or in part. 4. The Commission shall maintain a list of all waivers issued and their conditions and shall make this list available to the public.
2021/07/19
Committee: JURI
Amendment 521 #
Proposal for a regulation
Article -10 a (new)
Article -10 a Conflict between Union Acts 1. Where any obligation set down in this Regulation can be viewed as equivalent to or superseded by an obligation within another Union act to which a provider of intermediary services is also subject, the provider may apply to the Commission for a waiver from such requirements or for a declaration that it should be deemed to have complied with this Regulation, in whole or in part. The provider shall present justified reasons for its request. 2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver or declaration, in whole or in part, from the requirements of this Regulation. 3. Upon the request of the Board or on its own initiative, the Commission may review a waiver or declaration issued and revoke the waiver or declaration in whole or in part. 4. The Commission shall maintain a list of all waivers and declarations issued and their conditions and shall make this list available to the public.
2021/07/19
Committee: JURI
Amendment 523 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2 a. Providers of intermediary services may establish the same single point of contact for this Regulation as the single point of contact required under other Union acts. When doing so, the provider shall inform the Commission of this decision.
2021/07/19
Committee: JURI
Amendment 528 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission has issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
2021/07/08
Committee: IMCO
Amendment 541 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity as to under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV and Articles 8 and 9 by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction.
2021/07/08
Committee: IMCO
Amendment 542 #
Proposal for a regulation
Recital 78
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure coherence between the Member States, the Commission should adopt guidance on the procedures and rules related to the powers of Digital Services Coordinators.
2021/07/08
Committee: IMCO
Amendment 546 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall, when complying with the requirements of this Article, not be required to disclose algorithms or any information that, with reasonable certainty, would enable the deception of consumers or cause consumer harm through the manipulation of their services. This Article shall be without prejudice to Directive (EU) 2016/943.
2021/07/19
Committee: JURI
Amendment 551 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Providers of intermediary services shall refrain from using dark patterns or other techniques to encourage the acceptance of terms and conditions, including the giving of consent to the sharing of personal and non-personal data.
2021/07/19
Committee: JURI
Amendment 554 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service.
2021/07/19
Committee: JURI
Amendment 556 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2d. For providers other than very large online platforms, nothing in this Regulation shall prevent a provider of intermediary services from terminating the contractual relationship with its recipients without cause, in the situations provided for in the terms and conditions. Providers of a very large online platform shall issue a statement of reasons for the termination to the recipient, and the recipient shall have access to the internal complaint mechanism under Article 17 and the out-of-court mechanism under Article 18.
2021/07/19
Committee: JURI
Amendment 557 #
Proposal for a regulation
Recital 88
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level and with legal personality, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to the Member State's Digital Services Coordinator.
2021/07/08
Committee: IMCO
Amendment 558 #
Proposal for a regulation
Article 12 a (new)
Article 12a Fair consent choice screens 1. Providers of intermediary services that ask the recipients of their service for consent as required by Regulation (EU) 2016/679 to collect or process personal data concerning them shall ensure that the end user choice screens shown to that end are designed in a fair and neutral manner and do not in any way subvert or impair user autonomy, decision-making, or choice via the choice screens’ structure, function or manner of operation. In particular, providers shall refrain from: (a) giving more visual prominence to any of the consent options when asking the recipient of the service for a decision; (b) repeatedly requesting that a recipient of the service consents to data processing, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with user experience; (c) urging a recipient of the service to change any setting or configuration of the service after the person in question has already made their choice, including by the use of a technical standard in accordance with paragraph 3; (d) making the procedure of cancelling a service more cumbersome than signing up to it. 2. The Commission may adopt implementing acts to prescribe binding design aspects and functions of consent choice screens that fulfil the requirements of paragraph 1. 3. Providers of intermediary services shall accept the communication of consent choices made by the recipient of the service through automated means, including through standardised digital signals sent by the recipient’s software used to access the service, such as web browsers and operating systems. 4. The Commission shall promote and facilitate the development of technical standards for the automated communication of consent choices through international and Union standardisation bodies. Where standardisation bodies fail to develop a workable technical standard, the Commission shall, not later than two years after the entry into force of this Regulation, designate a binding technical standard for the purpose of paragraph 3.
2021/07/19
Committee: JURI
Amendment 559 #
Proposal for a regulation
Article 12 a (new)
Article 12a General Risk Assessment and Mitigation Measures 1. Providers of intermediary services shall identify, analyse and assess, at least once a year and at each significant revision of a service they provide thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to each of their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors, and on gender equality. 2. Providers of intermediary services shall, wherever possible, put in place reasonable, proportionate and effective mitigation measures for the risks identified, in line with applicable law and their terms and conditions. 3. Where the identified risk relates to minor recipients of the service, regardless of whether the minor is acting in accordance with the terms and conditions, mitigation measures shall include, where needed and applicable: (a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the minor; (b) adapting or removing system design features that expose minors to, or promote, content, contact, conduct and contract risks that impair their physical, mental or moral development; (c) ensuring the highest levels of privacy, safety, consumer protection and security by design and default for individual recipients of the service under the age of 18; (d) if a service is targeted at minors, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support. 4. Providers of intermediary services shall, upon request, explain to the competent Digital Services Coordinator how they undertook this risk assessment and what mitigation measures they took.
2021/07/19
Committee: JURI
Amendment 560 #
Proposal for a regulation
Article 12 b (new)
Article 12b Fair consent choice screens 1. Providers of intermediary services that ask the recipients of their service for consent as required by Regulation (EU) 2016/679 to process personal data concerning them shall ensure that the end user choice screens shown to that end are designed in a fair and neutral manner and do not in any way subvert or impair user autonomy, decision-making, or choice via the choice screens’ structure, function or manner of operation. 2. A choice or decision made by the recipient of the service using an online interface or part thereof that does not comply with the requirements of paragraph 1 shall not constitute consent in the sense of Regulation (EU) 2016/679. 3. Paragraphs 1 and 2 shall also apply to consent given prior to the entry into force of this Regulation. 4. The Commission may adopt implementing acts to prescribe binding design aspects and functions of consent choice screens that fulfil the requirements of paragraph 1. 5. Providers of intermediary services shall accept the communication of consent choices made by the recipient of the service through automated means, including through standardised digital signals sent by the recipient’s software used to access the service such as web browsers and operating systems. 6. Providers of intermediary services shall respect the communication of choices made by the recipients of the service, including consent or withdrawal of consent to the processing of personal data, through automated means, such as through the settings of software placed on the market permitting electronic communications, including the retrieval and presentation of information on the internet. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions for the automated means referred to above. 7. The Board, in cooperation with the Commission, shall publish official guidelines to indicate specific design patterns that qualify as subverting or impairing the autonomy, decision-making, or choice of the recipients of the service. The Board shall keep this list updated in the light of technological developments and, in the case of very large online platforms, assessments related to systemic risks identified in accordance with Article 27(2). 8. The Commission may adopt implementing acts to prescribe the design and functions of online interfaces that facilitate expression of consent in the sense of Regulation (EU) 2016/679 or other choices that may be expressed by the recipients of the service. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. Before the adoption of any measures pursuant to this paragraph, the Commission shall publish a draft thereof and invite all interested parties to submit their comments within the time period set out therein, which shall not be less than two months.
2021/07/19
Committee: JURI
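By way of technical illustration, paragraph 5 of the proposed Article 12b presupposes that a service can read standardised consent signals sent by the recipient's browser or operating system. A minimal sketch in Python of how a service might honour such a signal before rendering any consent choice screen, using the real-world Global Privacy Control header ("Sec-GPC") and the legacy "DNT" header as examples; the function name and the "refused" label are hypothetical and not drawn from the amendment:

```python
# Illustrative sketch only (not part of the amendment text): honouring a
# standardised, browser-sent consent signal before any choice screen is shown.
# "Sec-GPC" (Global Privacy Control) and "DNT" (Do Not Track) are real-world
# examples of such signals; the function name and labels are hypothetical.
from typing import Optional


def consent_from_automated_signals(headers: dict) -> Optional[str]:
    """Derive a consent choice from automated signals in the HTTP request."""
    if headers.get("Sec-GPC") == "1":  # user's software signals an opt-out
        return "refused"
    if headers.get("DNT") == "1":      # legacy Do Not Track signal
        return "refused"
    return None  # no signal sent; a fair and neutral choice screen is shown


if __name__ == "__main__":
    print(consent_from_automated_signals({"Sec-GPC": "1"}))  # refused
    print(consent_from_automated_signals({}))                # None
```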
Amendment 563 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the orders;
2021/07/19
Committee: JURI
Amendment 578 #
Proposal for a regulation
Recital 97 a (new)
(97a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.
2021/07/08
Committee: IMCO
Amendment 579 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. Where made available to the public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider's terms and conditions.
2021/07/19
Committee: JURI
Amendment 585 #
Proposal for a regulation
Recital 99
(99) In particular, the Commission, where it can show grounds for believing that a very large online platform is not compliant with this Regulation, should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information related to those concerns. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
2021/07/08
Committee: IMCO
Amendment 592 #
Proposal for a regulation
Recital 102
(102) In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure. In addition, the Commission should carry out an assessment of the costs to European service providers of any similar requirements, including those of Article 11, introduced by third countries, and of any new barriers to non-EU market access after the adoption of this Regulation. The Commission should also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union.
2021/07/08
Committee: IMCO
Amendment 594 #
Proposal for a regulation
Recital 104
(104) In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is equally important that, where standardisation bodies are unable to agree on the standards needed to implement this Regulation fully, the Commission be able to adopt delegated acts. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.
2021/07/08
Committee: IMCO
Amendment 595 #
Proposal for a regulation
Recital 105 a (new)
(105a) This Regulation serves as a horizontal framework to further strengthen and deepen the Digital Single Market and the internal market, and therefore seeks to lay down rules and obligations which, unless otherwise specified, apply to all providers regardless of their individual models of operation. Individual models of operation are often addressed in other Union law regarded as lex specialis. In the case of any potential conflict between this Regulation and those Union acts, the principle of lex specialis derogat legi generali should apply.
2021/07/08
Committee: IMCO
Amendment 608 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/08
Committee: IMCO
Amendment 622 #
Proposal for a regulation
Article 1 – paragraph 3
3. This Regulation shall apply to intermediary services directed at and provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
2021/07/08
Committee: IMCO
Amendment 625 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
(b) Directive 2010/13/EU as amended by Directive (EU) 2018/1808;
2021/07/08
Committee: IMCO
Amendment 631 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on Copyright and Related Rights in the Digital Single Market;
2021/07/08
Committee: IMCO
Amendment 634 #
Proposal for a regulation
Article 1 – paragraph 5 – point h
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Regulation XXX (General Product Safety Regulation);
2021/07/08
Committee: IMCO
Amendment 638 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
2021/07/08
Committee: IMCO
Amendment 642 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5a. The Commission shall, by [within one year of the adoption of this Regulation], publish guidelines with regard to the relationship between this Regulation and the legislative acts listed in Article 1(5). These guidelines shall clarify any potential conflicts between the conditions and obligations laid down in those legislative acts, specify which act prevails where actions taken in line with this Regulation also fulfil the obligations of another legislative act, and identify which regulatory authority is competent.
2021/07/08
Committee: IMCO
Amendment 647 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b) ‘recipient of the service’ means any natural or legal person who, for professional reasons or otherwise, uses the relevant intermediary service, in particular for the purposes of seeking information or making it accessible;
2021/07/08
Committee: IMCO
Amendment 650 #
Proposal for a regulation
Article 2 – paragraph 1 – point c
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;
2021/07/08
Committee: IMCO
Amendment 654 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
2021/07/19
Committee: JURI
Amendment 662 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions whether or not to restrict the ability to monetize content provided by the recipients;
2021/07/19
Committee: JURI
Amendment 664 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
— the proactive directing of activities towards one or more Member States.
2021/07/08
Committee: IMCO
Amendment 667 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession, irrespective of the legality of those actions;
2021/07/08
Committee: IMCO
Amendment 671 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – introductory part
(f) ‘intermediary service’ means one of the following information society services:
2021/07/08
Committee: IMCO
Amendment 673 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 1
— a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, including technical auxiliary functional services;
2021/07/08
Committee: IMCO
Amendment 682 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned, or of individuals or entities that have submitted notices, to seek redress against the decision before a court in accordance with the applicable law.
2021/07/19
Committee: JURI
Amendment 692 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/19
Committee: JURI
Amendment 695 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service with which it has a direct relationship, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. For the purpose of this Regulation, a cloud computing service shall not be considered to be an online platform in cases where allowing the dissemination of hyperlinks to specific content constitutes a minor and ancillary feature.
2021/07/08
Committee: IMCO
Amendment 705 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘cloud computing service’ means a digital service that enables access to a scalable and elastic pool of shareable computing resources;
2021/07/08
Committee: IMCO
Amendment 714 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(ia) ‘online marketplace’ means an online platform which allows consumers to conclude distance contracts with traders on its platform;
2021/07/08
Committee: IMCO
Amendment 716 #
Proposal for a regulation
Article 2 – paragraph 1 – point k a (new)
(ka) ‘trusted flagger’ means an entity that has been nominated by a Digital Services Coordinator based on specific conditions to be authorised to issue priority notifications as to illegal content found on a platform.
2021/07/08
Committee: IMCO
Amendment 718 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether the person is incorporated or unincorporated and irrespective of whether the information is designed to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface, normally against remuneration, specifically for promoting that message;
2021/07/08
Committee: IMCO
Amendment 719 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform; (deleted)
2021/07/19
Committee: JURI
Amendment 724 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities, either through automated or manual means, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, monetisation and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
2021/07/08
Committee: IMCO
Amendment 726 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually;
2021/07/19
Committee: JURI
Amendment 727 #
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
(cb) it is not already a trusted flagger in another Member State.
2021/07/19
Committee: JURI
Amendment 728 #
Proposal for a regulation
Article 19 – paragraph 2 – point c c (new)
(cc) it publishes, at least once a year, clear, easily comprehensible and detailed reports on any notices submitted in accordance with Article 14 during the relevant period. The report shall list notices categorised by the identity of the hosting service provider, the type of alleged illegal content or content violating terms and conditions concerned, and what action was taken by the provider. In addition, the report shall identify relationships between the trusted flagger and any online platform, law enforcement, or other government or relevant commercial entity, and explain the means by which the trusted flagger maintains its independence.
2021/07/19
Committee: JURI
Amendment 738 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘dark pattern’ means a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision- making or choice.
2021/07/08
Committee: IMCO
Amendment 742 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(qb) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882
2021/07/08
Committee: IMCO
Amendment 744 #
Proposal for a regulation
Article 19 a (new)
Article 19a
Accessibility requirements for online platforms
1. Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882.
2. Providers of online platforms shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation.
3. Providers of online platforms shall ensure that information, forms and measures provided pursuant to this Regulation are made available in a manner that is easy to find and accessible to persons with disabilities.
4. Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider.
5. In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements.
6. Providers of online platforms shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements.
7. Online platforms which are in conformity with harmonised standards or parts thereof, the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements.
8. Online platforms which are in conformity with the technical specifications or parts thereof adopted for Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
2021/07/19
Committee: JURI
Amendment 745 #
Proposal for a regulation
Article 2 – paragraph 1 – point q c (new)
(qc) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful.
2021/07/08
Committee: IMCO
Amendment 752 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient or more secure the information's onward transmission to other recipients of the service upon their request, on condition that:
2021/07/08
Committee: IMCO
Amendment 753 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) the provider does not modify the informational content;
2021/07/08
Committee: IMCO
Amendment 761 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where:
(a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient;
(b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or
(c) the items removed were related to content covered by [Directive 2011/93/EU updated reference] or [Directive (EU) 2017/541 XXX New Ref to TCO Regulation].
2021/07/19
Committee: JURI
Amendment 770 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online marketplaces, where such an online marketplace presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
2021/07/08
Committee: IMCO
Amendment 773 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. Unless instructed otherwise by the informed authority, the provider shall remove or disable the content. It shall store all content and related data for at least six months.
2021/07/19
Committee: JURI
Amendment 774 #
Proposal for a regulation
Article 21 – paragraph 2 b (new)
2b. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
2021/07/19
Committee: JURI
Amendment 775 #
Proposal for a regulation
Article 21 – paragraph 2 c (new)
2c. The Commission shall adopt an implementing act setting down a template for notifications under paragraph 1.
2021/07/19
Committee: JURI
Amendment 776 #
Proposal for a regulation
Article 21 – paragraph 2 d (new)
2d. Where a notification of suspicions of criminal offences includes information which may be seen as potential electronic information in criminal proceedings, Regulation XXX [E-evidence] shall apply.
2021/07/19
Committee: JURI
Amendment 783 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
2021/07/19
Committee: JURI
Amendment 784 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or [Article XX of the General Product Safety Regulation], or any relevant act of Union law; _________________ 51Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
2021/07/19
Committee: JURI
Amendment 793 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. This Regulation shall not prevent providers from offering end- to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability.
2021/07/08
Committee: IMCO
Amendment 806 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific individual item of illegal content, received from the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
2021/07/08
Committee: IMCO
Amendment 809 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
1a. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that has issued the order.
2021/07/08
Committee: IMCO
Amendment 819 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
— the identification of the issuing authority and the means to verify the authentication of the order;
2021/07/08
Committee: IMCO
Amendment 820 #
Proposal for a regulation
Article 23 – paragraph 4 a (new)
4a. Where published to the general public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions.
2021/07/19
Committee: JURI
Amendment 824 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that directly or indirectly display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, meaningful, salient, uniform and unambiguous manner and in real time:
2021/07/19
Committee: JURI
Amendment 828 #
Proposal for a regulation
Article 8 – paragraph 2 – point b
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective and in any case does not exceed the territory of the Member State of the order;
2021/07/08
Committee: IMCO
Amendment 828 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed, and the logic involved;
2021/07/19
Committee: JURI
Amendment 829 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(ca) whether the advertisement was selected using an automated mechanism, such as ad exchange mechanisms, and if so, the identity of the natural or legal person responsible for the system;
2021/07/19
Committee: JURI
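By way of technical illustration, points (a) to (ca) of Article 24 as amended amount to a per-advertisement disclosure record shown to the recipient in real time. A minimal sketch in Python of what such a record could contain; the schema, field names and example party names are all hypothetical and not drawn from the amendments:

```python
# Illustrative sketch only (not part of the amendment text): a per-advertisement
# transparency record covering the sponsor, the payer, the main targeting
# parameters and logic, and any automated selection mechanism. All field names
# and example values are hypothetical.
import json

ad_disclosure = {
    "is_advertisement": True,
    "displayed_on_behalf_of": "Example Brand B.V.",
    "paid_for_by": "Example Media Agency GmbH",
    "main_parameters": ["age: 25-34", "interest: cycling", "location: NL"],
    "selection_logic": "highest bid in a real-time auction",
    "selected_by_automated_mechanism": True,  # e.g. an ad exchange
    "automated_system_operator": "ExampleExchange Ltd.",
}

# Serialised for display to the recipient alongside the advertisement.
print(json.dumps(ad_disclosure, indent=2))
```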
Amendment 836 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider.
2021/07/08
Committee: IMCO
Amendment 840 #
Proposal for a regulation
Article 8 – paragraph 2 – point c a (new)
(ca) the order is issued only where no other effective means are available to bring about the cessation or the prohibition of the infringement
2021/07/08
Committee: IMCO
Amendment 841 #
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
(cb) where more than one provider of intermediary services is responsible for hosting the specific item, the order is issued to the most appropriate provider that has the technical and operational ability to act against the specific item.
2021/07/08
Committee: IMCO
Amendment 842 #
Proposal for a regulation
Article 8 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders.
2021/07/08
Committee: IMCO
Amendment 843 #
Proposal for a regulation
Article 8 – paragraph 2 b (new)
2b. Member States shall ensure that providers have a right to appeal and object to implementing the order and shall facilitate the use and access to that right.
2021/07/08
Committee: IMCO
Amendment 844 #
Proposal for a regulation
Article 8 – paragraph 2 c (new)
2c. When an order to act against a specific individual item of illegal content is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator from the Member State of the judicial or administrative authority.
2021/07/08
Committee: IMCO
Amendment 846 #
Proposal for a regulation
Article 8 – paragraph 3 – subparagraph 1 a (new)
Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, on the basis of a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order is to be enforced.
2021/07/08
Committee: IMCO
Amendment 849 #
Proposal for a regulation
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law, including the Charter of Fundamental Rights. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives pursued therein.
2021/07/08
Committee: IMCO
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 862 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, received from the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/07/08
Committee: IMCO
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 865 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
1a. If the provider cannot comply with the information order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that issued the information order.
2021/07/08
Committee: IMCO
Amendment 869 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, such as ‘deep fakes’ or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, democratic values, media freedom and freedom of expression of journalists, as well as their ability to verify facts, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/19
Committee: JURI
Amendment 870 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
— the identification of the issuing authority and the means to verify the authentication of the order;
2021/07/08
Committee: IMCO
Amendment 879 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider;
2021/07/08
Committee: IMCO
Amendment 882 #
Proposal for a regulation
Article 9 – paragraph 2 – point c a (new)
(ca) the order is issued only where no other effective means are available to receive the same specific item of information
2021/07/08
Committee: IMCO
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/19
Committee: JURI
Amendment 884 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. It shall ensure that the form meets the standards set down in the Annex of [XXX the regulation on European Production and Preservation Orders for electronic evidence in criminal matters].
2021/07/08
Committee: IMCO
Amendment 885 #
Proposal for a regulation
Article 9 – paragraph 2 b (new)
2b. When an order to provide a specific item of information about one or more specific individual recipients of the service is issued by a relevant national judicial or administrative authority, Member States shall ensure that the relevant national judicial or administrative authority duly informs the Digital Services Coordinator from the Member State of the judicial or administrative authority.
2021/07/08
Committee: IMCO
Amendment 885 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, design, the features or functioning of their services, or their terms and conditions;
2021/07/19
Committee: JURI
Amendment 887 #
Proposal for a regulation
Article 9 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law.
2021/07/08
Committee: IMCO
Amendment 891 #
Proposal for a regulation
Chapter III – title
Obligations for a transparent, accessible and safe online environment
2021/07/08
Committee: IMCO
Amendment 894 #
Proposal for a regulation
Article 9 a (new)
Article 9a
Waiver
1. Providers of intermediary services may apply to the Commission for a waiver from the requirements of Chapter III, provided that they are:
(a) not-for-profit or equivalent and serve a manifestly positive role in the public interest;
(b) micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC; or
(c) medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC without any systemic risk related to illegal content.
The providers shall present justified reasons for their request.
2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver, in whole or in part, from the requirements of this Chapter.
3. Upon the request of the Board or the provider, or on its own initiative, the Commission may review a waiver issued and revoke the waiver in whole or in part.
4. The Commission shall maintain a list of all waivers issued and their conditions and shall make this list available to the public.
(This amendment should be placed between the Chapter Title and the Section title)
2021/07/08
Committee: IMCO
Amendment 896 #
Proposal for a regulation
Article 9 a (new)
Article 9a
Conflict between Union Acts
1. Where any obligation set down in this Regulation can be viewed as equivalent to, or superseded by, an obligation within another Union act to which a provider of intermediary services is also subject, the provider may apply to the Commission for a waiver from such requirements or for a declaration that it should be deemed to have complied with this Regulation, in whole or in part. The provider shall present justified reasons for its request.
2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver or declaration, in whole or in part, from the requirements of this Regulation.
3. Upon the request of the Board or on its own initiative, the Commission may review a waiver or declaration issued and revoke it in whole or in part.
4. The Commission shall maintain a list of all waivers and declarations issued and their conditions and shall make this list available to the public.
2021/07/08
Committee: IMCO
Amendment 899 #
Proposal for a regulation
Article 27 – paragraph 1 b (new)
1b. The Board shall evaluate the implementation and effectiveness of the mitigation measures undertaken by very large online platforms listed in Article 27(1) and, where necessary, may issue recommendations.
2021/07/19
Committee: JURI
Amendment 900 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and for the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
2021/07/19
Committee: JURI
Amendment 905 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2a. Providers of intermediary services may establish a single point of contact serving both this Regulation and other Union law that requires a single point of contact. When doing so, the provider shall inform the Commission of this decision.
2021/07/08
Committee: IMCO
Amendment 921 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5a. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC, other than those which are either a very large online platform or an online marketplace.
2021/07/08
Committee: IMCO
Amendment 929 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 931 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible, machine-readable format.
2021/07/08
Committee: IMCO
Amendment 939 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and non-arbitrary manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter and, where applicable, any community or other standards created by recipients of the service.
2021/07/08
Committee: IMCO
Amendment 952 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(ba) the natural or legal person or group who paid for the advertisement;
2021/07/19
Committee: JURI
Amendment 957 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The very large online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform, in particular:
(a) to withdraw consent or to object to processing;
(b) to obtain access to the personal data concerning the data subject;
(c) to obtain rectification of inaccurate personal data concerning the data subject;
(d) to obtain erasure of personal data without undue delay.
Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed, in accordance with Article 19 of Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 960 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms shall be prohibited from profiling or targeting minors with personalised advertising, in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 962 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. Very large online platforms shall take adequate measures to detect inauthentic videos (‘deep fakes’). When detecting such videos, they should label them as inauthentic in a way that is clearly visible to the internet user.
2021/07/19
Committee: JURI
Amendment 963 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service.
2021/07/08
Committee: IMCO
Amendment 963 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
2d. Very large online platforms shall offer users the opportunity to check if their username and password have been compromised in a data leak, such as through the pwned open source database.
2021/07/19
Committee: JURI
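By way of technical illustration, the credential-leak check in the proposed paragraph 2d can be implemented without the password ever leaving the platform in usable form, for example via the k-anonymity range API of the "Have I Been Pwned" Pwned Passwords service, which is presumably the database the amendment refers to. A minimal sketch in Python; only the first five characters of the password's SHA-1 hash are sent over the network:

```python
# Illustrative sketch only (not part of the amendment text): checking a password
# against the free "Have I Been Pwned" Pwned Passwords range API. Only the first
# five hex characters of the SHA-1 digest are transmitted (k-anonymity), so the
# password itself is never disclosed to the API.
import hashlib
import urllib.request


def password_breach_count(password: str) -> int:
    """Return how many times the password appears in known data leaks."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as response:
        body = response.read().decode("utf-8")
    # Each response line has the form "<35-char hash suffix>:<count>".
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count.strip())
    return 0


if __name__ == "__main__":
    print(password_breach_count("password123"))  # a very large number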
Amendment 964 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2d. For providers other than very large online platforms, nothing in this Regulation shall prevent the provider of intermediary services concerned from terminating the contractual relationship with its recipients without cause, in the situations provided for in the terms and conditions. Providers of a very large online platform shall issue a statement for the termination to the recipient, and the recipient shall have access to the internal complaint mechanism under Article 17 and the out-of-court mechanism under Article 18.
2021/07/08
Committee: IMCO
Amendment 966 #
Proposal for a regulation
Article 12 a (new)
Article 12a
General Risk Assessment and Mitigation Measures
1. Providers of intermediary services shall identify, analyse and assess, at least once and at each significant revision of a service thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors and gender equality.
2. Providers of intermediary services which identify potential risks shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures in line with their terms and conditions.
3. Where the identified risk relates to minors, regardless of whether the child is acting in accordance with the terms and conditions, mitigation measures shall include, taking into account the industry standards referred to in Article 34, where needed and applicable:
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the child;
(b) adapting or removing system design features that expose children to, or promote, content, contact, conduct and contract risks;
(c) ensuring the highest levels of privacy, safety, and security by design and default for children, including with regard to any profiling or use of data for commercial purposes;
(d) if a service is targeted at children, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support.
4. Providers of intermediary services shall, upon request, explain to the Digital Services Coordinator of the Member State of establishment how they undertook this risk assessment and what voluntary mitigation measures they undertook.
2021/07/08
Committee: IMCO
Amendment 972 #
Proposal for a regulation
Article 12 b (new)
Article 12b
Fair consent choice screens
1. Providers of intermediary services that ask the recipients of their service for consent as required by Regulation (EU) 2016/679 to collect or process personal data concerning them shall ensure that the end user choice screens shown to that end are designed in a fair and neutral manner and do not in any way subvert or impair user autonomy, decision-making, or choice via the choice screens’ structure, function or manner of operation. In particular, providers shall refrain from:
(a) giving more visual prominence to any of the consent options when asking the recipient of the service for a decision;
(b) repeatedly requesting that a recipient of the service consents to data processing, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with the user experience;
(c) urging a recipient of the service to change any setting or configuration of the service after the person in question has already made his or her choice, including by the use of a technical standard in accordance with paragraph 3;
(d) making the procedure of cancelling a service more cumbersome than signing up to it.
2. The Commission may adopt implementing acts to prescribe binding design aspects and functions of consent choice screens that fulfil the requirements of paragraph 1.
3. Providers of intermediary services shall accept the communication of consent choices made by the recipient of the service through automated means, including through standardised digital signals sent by the recipient’s software used to access the service, such as web browsers and operating systems.
4. The Commission shall promote and facilitate the development of technical standards for the automated communication of consent choices through international and Union standardisation bodies. Where standardisation bodies fail to develop a workable technical standard, the Commission shall, not later than two years after the entry into force of this Regulation, designate a binding technical standard for the purpose of paragraph 3.
2021/07/08
Committee: IMCO
Amendment 981 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the orders;
2021/07/08
Committee: IMCO
Amendment 983 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices, differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action. Providers of intermediary services may add additional information as to the reasons for the average time for taking the action;
2021/07/08
Committee: IMCO
Amendment 1003 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
2a. The reports shall include information on content moderation broken down per Member State in which the services are offered and for the Union as a whole, and shall be published in the official languages of the Member States of the Union.
2021/07/19
Committee: JURI
Amendment 1005 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which are not very large online platforms in accordance with Article 25.
2021/07/08
Committee: IMCO
Amendment 1010 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. Where made available to the public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1010 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of standards set by relevant European and international standardisation bodies, subject to transparent, multi-stakeholder and inclusive processes in line with Regulation (EU) 1025/2012, for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
2021/07/19
Committee: JURI
Amendment 1023 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or non-governmental entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access and user-friendly, shall allow for the submission of notices exclusively by electronic means, and may include:
(a) a clearly identifiable banner or single reporting button, allowing the users of those services to notify the providers of hosting services quickly and easily;
(b) providing information to the users on what is considered illegal content under Union and national law;
(c) providing information to the users on available national public tools to signal illegal content to the competent authorities in Member States where the service is directed.
2021/07/08
Committee: IMCO
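By way of technical illustration, the electronic notice mechanism of Article 14(1), read together with the elements required by Article 14(2), lends itself to simple server-side validation before a notice enters the moderation queue. A minimal sketch in Python; all field names are hypothetical and merely mirror the elements listed in the proposal:

```python
# Illustrative sketch only (not part of the amendment text): validating that an
# electronically submitted notice carries the elements of Article 14(2): an
# explanation, the exact location of the item, the notifier's name and e-mail,
# and a good-faith statement. All field names are hypothetical.
REQUIRED_ELEMENTS = (
    "explanation",           # why the item is considered illegal content
    "exact_location",        # e.g. the exact URL(s) of the item(s)
    "notifier_name",
    "notifier_email",
    "good_faith_statement",  # confirmation of accuracy and completeness
)


def missing_elements(notice: dict) -> list:
    """Return the Article 14(2) elements that are absent or empty."""
    return sorted(field for field in REQUIRED_ELEMENTS if not notice.get(field))


if __name__ == "__main__":
    draft = {
        "explanation": "Counterfeit goods",
        "exact_location": "https://example.com/item/1",
    }
    print(missing_elements(draft))
    # ['good_faith_statement', 'notifier_email', 'notifier_name']
```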
Amendment 1042 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for the protection of minors
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers and organisations representing minors, parents and civil society organisations or relevant authorities to further contribute to the protection of minors online.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online, which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least:
(a) age verification and age assurance models, taking into account the industry standards referred to in Article 34;
(b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/19
Committee: JURI
Amendment 1052 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete to the best of their knowledge.
2021/07/08
Committee: IMCO
Amendment 1057 #
Proposal for a regulation
Article 14 – paragraph 3
3. Adequately substantiated notices that include the elements referred to in paragraph 2 shall be considered to give rise to an obligation to investigate the notice in an effective and timely manner. If a provider is unable to determine if a notice is valid, the provider may ask the Digital Services Coordinator or other national administrative bodies for an opinion before removing or disabling the content.
2021/07/08
Committee: IMCO
Amendment 1075 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/08
Committee: IMCO
Amendment 1088 #
Proposal for a regulation
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of the result of the investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. The Digital Services Coordinator shall at least conduct a preliminary assessment of the issue raised.
2021/07/19
Committee: JURI
Amendment 1093 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing or disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient on a durable medium, at the latest at the time of the removal or disabling of access or the restriction of visibility or the suspension or termination of monetisation, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/08
Committee: IMCO
Amendment 1104 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of, the information and, where relevant, the territorial scope of the disabling of access or the restriction;
2021/07/08
Committee: IMCO
Amendment 1119 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall publish at least annually the decisions and the statements of reasons, referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data.
2021/07/08
Committee: IMCO
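By way of technical illustration, Article 15(4) as amended would have hosting providers publish their decisions and statements of reasons, stripped of personal data, in a database managed by the Commission. A minimal sketch in Python of the shape such a published record might take; the schema and field names are entirely hypothetical:

```python
# Illustrative sketch only (not part of the amendment text): a statement-of-
# reasons record as it might appear in the public database under Article 15(4).
# The schema is hypothetical; note that no personal data is included.
import json

statement_of_reasons = {
    "decision": "visibility_restriction",  # removal / disabling / demonetisation
    "territorial_scope": ["EU"],
    "ground": "terms_and_conditions",      # or "illegal_content" plus legal basis
    "facts_and_circumstances": "Automated detection of coordinated spam posting.",
    "automated_means_used": True,
    "redress": [
        "internal_complaint",
        "out_of_court_dispute_settlement",
        "judicial_redress",
    ],
    "decision_date": "2021-07-08",
}

print(json.dumps(statement_of_reasons, indent=2))
```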
Amendment 1121 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4a. Paragraph 1 shall not apply where: - a provider of hosting services does not have the information necessary to inform the recipient by a durable medium; - a provider of hosting services has already informed the recipient of the removal or disabling of the same or similar items of information from the same recipient; - the content is manifestly illegal; - the content is deceptive, high-volume commercial content; or - a judicial or law enforcement authority has requested that the recipient not be informed due to an ongoing criminal investigation, until the criminal investigation is closed.
2021/07/08
Committee: IMCO
Amendment 1131 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Alternative mechanisms based on an adequacy decision
1. Where a platform has an existing alternative notice and action mechanism as set down by the law of a third country or in accordance with other Union law, the Commission may, upon a request by a provider, issue a decision declaring that this mechanism ensures an adequate level of protection and fulfils the requirements of Articles 14 and 15. Before issuing any such decision, the Commission shall consult the Board and the general public at least one month before the decision is adopted.
2021/07/08
Committee: IMCO
Amendment 1141 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which are not very large online platforms in accordance with Article 25.
2021/07/08
Committee: IMCO
Amendment 1154 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
2021/07/08
Committee: IMCO
Amendment 1169 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetise content provided by the recipients.
2021/07/08
Committee: IMCO
Amendment 1195 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions that would negatively affect them and that are referred to in paragraph 4 are not solely taken on the basis of automated means.
2021/07/08
Committee: IMCO
Amendment 1201 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1), shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 and established in the Member State of the provider or the Member State of the recipient, in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/08
Committee: IMCO
Amendment 1207 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Where a recipient seeks a resolution to multiple complaints, either party may request that the out-of-court dispute settlement body treat and resolve these complaints in a single dispute decision.
2021/07/08
Committee: IMCO
Amendment 1225 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/08
Committee: IMCO
Amendment 1234 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner that is accessible to persons with disabilities, and in at least one official language of the Union;
2021/07/08
Committee: IMCO
Amendment 1248 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, and the body does not find the recipient acted in bad faith in the dispute, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
2021/07/08
Committee: IMCO
Amendment 1255 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. For providers other than very large online platforms, this Article shall only take effect from [24 months after the date of entry into force of this Regulation].
2021/07/08
Committee: IMCO
Amendment 1274 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement, or other government or relevant commercial entity;
2021/07/08
Committee: IMCO
Amendment 1285 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually;
2021/07/08
Committee: IMCO
Amendment 1286 #
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
(cb) it is not already a trusted flagger in another Member State.
2021/07/08
Committee: IMCO
Amendment 1287 #
Proposal for a regulation
Article 19 – paragraph 2 – point c c (new)
(cc) it publishes, at least once a year, clear, easily comprehensible and detailed reports on any notices submitted in accordance with Article 14 during the relevant period. The report shall list notices categorised by the identity of the hosting service provider, the type of allegedly illegal content or content violating terms and conditions concerned, and the action taken by the provider. In addition, the report shall identify relationships between the trusted flagger and any online platform, law enforcement, or other government or relevant commercial entity, and explain the means by which the trusted flagger maintains its independence.
2021/07/08
Committee: IMCO
Amendment 1288 #
Proposal for a regulation
Article 19 – paragraph 2 – subparagraph 1 a (new)
By way of derogation from point (b), a public entity may be awarded the status of trusted flagger for actions not related to intellectual property rights.
2021/07/08
Committee: IMCO
Amendment 1289 #
Proposal for a regulation
Article 19 – paragraph 2 a (new)
2a. Online platforms may treat other third parties considered by the provider to have particular expertise and responsibilities for the purposes of tackling illegal content online as equal to a trusted flagger as regards the mechanisms referred to in Article 14. The conditions for granting such treatment shall be clearly set out and objective and shall be communicated to the Digital Services Coordinator of establishment. The names of such third parties shall be published in a clear and easily findable manner.
2021/07/08
Committee: IMCO
Amendment 1303 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. During the period of investigation by the Digital Services Coordinator, the trusted flagger shall be treated as a non-trusted flagger when using the mechanisms referred to in Article 14, unless suspended under Article 20.
2021/07/08
Committee: IMCO
Amendment 1309 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/07/08
Committee: IMCO
Amendment 1317 #
Proposal for a regulation
Article 19 a (new)
Article 19a
Accessibility requirements for online platforms
1. Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI and Section VII of Annex I to Directive (EU) 2019/882.
2. Providers of online platforms shall prepare the necessary information in accordance with Annex V to Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation.
3. Providers of online platforms shall ensure that information, forms and measures provided pursuant to this Regulation are made available in a manner that is easy to find and accessible to persons with disabilities.
4. Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services.
5. In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements.
6. Providers of online platforms shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements.
7. Online platforms which are in conformity with harmonised standards or parts thereof, the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements.
8. Online platforms which are in conformity with the technical specifications or parts thereof adopted for Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
2021/07/08
Committee: IMCO
Amendment 1322 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and, where proportionate, after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
2021/07/08
Committee: IMCO
Amendment 1338 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) the intention of the recipient, individual, entity or complainant, including whether submissions were made in bad faith;
2021/07/08
Committee: IMCO
Amendment 1340 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(da) whether a notice was submitted by an individual user or by an entity or persons with specific expertise related to the content in question;
2021/07/08
Committee: IMCO
Amendment 1344 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
(db) the manner in which notices have been submitted, including whether by automated means.
2021/07/08
Committee: IMCO
Amendment 1345 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where:
(a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient;
(b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or
(c) the items removed were related to content covered by [Directive 2011/93/EU updated reference], [Directive (EU) 2017/541] or Regulation (EU) 2021/784 of the European Parliament and of the Council.
2021/07/08
Committee: IMCO
Amendment 1348 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including examples as to the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
2021/07/08
Committee: IMCO
Amendment 1355 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of exact information giving rise to a suspicion that a serious criminal offence involving an imminent threat to the life or safety of persons has taken place, is taking place or is planned to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide, upon their request, any additional relevant information available.
2021/07/08
Committee: IMCO
Amendment 1363 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. Unless instructed otherwise by the informed authority, the provider shall remove or disable access to the content. It shall store all content and related data for at least six months.
2021/07/08
Committee: IMCO
Amendment 1364 #
Proposal for a regulation
Article 21 – paragraph 2 b (new)
2b. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
2021/07/08
Committee: IMCO
Amendment 1365 #
Proposal for a regulation
Article 21 – paragraph 2 c (new)
2c. The Commission shall adopt an implementing act setting down a template for notifications under paragraph 1.
2021/07/08
Committee: IMCO
Amendment 1366 #
Proposal for a regulation
Article 21 – paragraph 2 d (new)
2d. Where a notification of suspicions of criminal offences includes information which may be seen as potential electronic evidence in criminal proceedings, Regulation XXX [E-evidence] shall apply.
2021/07/08
Committee: IMCO
Amendment 1368 #
Proposal for a regulation
Article 22 – title
Traceability of traders on online marketplaces
2021/07/08
Committee: IMCO
Amendment 1375 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services for those purposes, the online marketplace has obtained the following information from traders, where applicable:
2021/07/08
Committee: IMCO
Amendment 1385 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
2021/07/08
Committee: IMCO
Amendment 1389 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and of the Council51 or [Article XX of the General Product Safety Regulation] or any relevant act of Union law; __________________ 51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
2021/07/08
Committee: IMCO
Amendment 1400 #
Proposal for a regulation
Article 22 – paragraph 1 a (new)
1a. Providers of online marketplaces shall require traders to provide the information referred to in points (a) and (e) immediately upon initial registration for their services. Traders shall be required to provide any supplementary material relating to the information requirements set out in Article 22(1) within a reasonable period, and in any event before offering products or services to consumers.
2021/07/08
Committee: IMCO
Amendment 1406 #
Proposal for a regulation
Article 22 – paragraph 2
2. Providers of online marketplaces shall, upon receiving that information, make best efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate through the use of any freely accessible official online database or online interface made available by an authorised administrator, a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources.
2021/07/08
Committee: IMCO
Amendment 1416 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where a provider of an online marketplace obtains sufficient indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/07/08
Committee: IMCO
Amendment 1419 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information, the online marketplace shall suspend the provision of its service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
2021/07/08
Committee: IMCO
Amendment 1425 #
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a. Providers of online marketplaces shall ensure that traders are given the ability to discuss directly with the provider any information viewed as inaccurate or incomplete before any suspension of services. This may take the form of the internal complaint-handling system under Article 17.
2021/07/08
Committee: IMCO
Amendment 1427 #
Proposal for a regulation
Article 22 – paragraph 3 b (new)
3b. If an online marketplace rejects an application for services or suspends services to a trader, the trader shall have recourse to the systems under Article 17 and Article 43 of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1429 #
Proposal for a regulation
Article 22 – paragraph 3 c (new)
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform the online marketplace without delay of any changes to the information provided.
2021/07/08
Committee: IMCO
Amendment 1432 #
Proposal for a regulation
Article 22 – paragraph 4
4. The online marketplace shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information no later than six months after the final conclusion of a distance contract.
2021/07/08
Committee: IMCO
Amendment 1448 #
Proposal for a regulation
Article 22 – paragraph 7
deleted
2021/07/08
Committee: IMCO
Amendment 1462 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Compliance by design
1. Providers of online marketplaces shall design and organise their online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
2. The online interface shall allow traders to provide at least the information necessary for the unequivocal identification of the products or the services offered, and, where applicable, the information concerning the labelling in compliance with rules of applicable Union law on product safety and product compliance.
3. This Article is without prejudice to additional requirements under other Union acts, including the [General Product Safety Regulation] and the [Market Surveillance Regulation].
2021/07/08
Committee: IMCO
Amendment 1466 #
Proposal for a regulation
Article 22 b (new)
Article 22b
Right to information
1. Where a provider of an online marketplace becomes aware, irrespective of the means used, of the illegal nature of a product or service offered through its services, it shall inform, wherever possible, those recipients of the service that had acquired such product or contracted such service during the last six months about the illegality, the identity of the trader and any means of redress.
2. Where the provider of the online marketplace does not have the contact details of the recipients of the service referred to in paragraph 1, the provider shall make publicly available and easily accessible on their online interface the information concerning the illegal products or services removed, the identity of the trader and any means of redress.
2021/07/08
Committee: IMCO
Amendment 1480 #
Proposal for a regulation
Article 23 – paragraph 4 a (new)
4a. Where published to the general public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection or removal of illegal content or content contrary to a hosting provider’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1506 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Without prejudice to other Union acts, online platforms that display user-generated content that may include sponsored information or other information equivalent to advertising, which is normally provided against remuneration, shall include in their terms and conditions an obligation for the recipients of their service to inform other recipients when they have received remuneration or any other goods in kind for their content. A failure to inform the platform or other recipients shall constitute a violation of the provider’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1537 #
Proposal for a regulation
Article 25 – paragraph 3 – subparagraph 1 a (new)
Such a methodology shall ensure the following in relation to active recipients:
(1) that automated interactions, accounts or data scans by a non-human (“bots”) are not included;
(2) that the mere viewing of a service, without purchase, logging in or other active identification of a recipient, shall not count as an active recipient;
(3) that the number shall be based on each service individually;
(4) that recipients connected on multiple devices are counted only once;
(5) that indirect use of a service, via a third party or linking, shall not be counted;
(6) that, where an online platform is hosted by another provider of intermediary services, the active recipients are assigned solely to the online platform closest to the recipient;
(7) that the average number is maintained for a period of at least six months.
2021/07/08
Committee: IMCO
Amendment 1556 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services;
2021/07/08
Committee: IMCO
Amendment 1565 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1576 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/08
Committee: IMCO
Amendment 1590 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how and whether their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1602 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any systemic risks, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1612 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, design, the features or functioning of their services, or their terms and conditions;
2021/07/08
Committee: IMCO
Amendment 1613 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of and targeting of advertisements in association with the service they provide or the alternative placement and display of public service advertisements or other related factual information;
2021/07/08
Committee: IMCO
Amendment 1625 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Very large online platforms shall, where appropriate, conduct their risk assessments referred to in Article 26 and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Where there is no such involvement, this shall be made clear in the transparency report referred to in Article 33.
2021/07/08
Committee: IMCO
Amendment 1641 #
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 a (new)
The reports of the Board shall include information broken down both per Member State in which the systemic risks occur and for the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union.
2021/07/08
Committee: IMCO
Amendment 1649 #
Proposal for a regulation
Article 27 – paragraph 3 a (new)
3a. The requirement to put in place mitigation measures shall not entail general monitoring or active fact-finding obligations.
2021/07/08
Committee: IMCO
Amendment 1653 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
2021/07/08
Committee: IMCO
Amendment 1664 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
2021/07/08
Committee: IMCO
Amendment 1677 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(fa) a description of specific elements that could not be audited, and an explanation of why these could not be audited;
2021/07/08
Committee: IMCO
Amendment 1679 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
2021/07/08
Committee: IMCO
Amendment 1685 #
Proposal for a regulation
Article 28 – paragraph 4 b (new)
4b. Where an audit report contains information that could be misused in order to harm the security and privacy of recipients of the platform, the very large online platform may request that the Commission remove or summarise such information in any public version of the audit report. The Commission shall consider any such request and may grant it if deemed merited.
2021/07/08
Committee: IMCO
Amendment 1691 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached and easily found from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1710 #
Proposal for a regulation
Article 30 – title
Additional transparency for online advertising and "deep fakes" audiovisual media
2021/07/08
Committee: IMCO
Amendment 1711 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
2021/07/08
Committee: IMCO
Amendment 1725 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
(ca) the natural or legal person or group who paid for the advertisement;
2021/07/08
Committee: IMCO
Amendment 1738 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Very large online platforms shall be prohibited from profiling children under the age of 16 for commercial practices, including personalised advertising, in compliance with industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1740 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The Board shall, after consulting with trusted flaggers and vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
2021/07/08
Committee: IMCO
Amendment 1743 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Where a very large online platform becomes aware that a piece of content is a deep fake, the provider shall label the content in a way that informs the recipient that the content is inauthentic and that is clearly visible to the recipient of the service.
2021/07/08
Committee: IMCO
Amendment 1745 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. The very large online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their data for each specific advertisement displayed to the data subject on the platform, in particular:
(a) to withdraw consent or to object to processing;
(b) to obtain access to the data concerning the data subject;
(c) to obtain rectification of inaccurate data concerning the data subject;
(d) to obtain erasure of data without undue delay.
Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed.
2021/07/08
Committee: IMCO
Amendment 1748 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
2d. Where a recipient exercises any of the rights referred to in points (a), (c) or (d) of paragraph 2c, the online platform must without undue delay cease displaying advertisements using the personal data concerned or using parameters which were set using this data.
2021/07/08
Committee: IMCO
Amendment 1749 #
Proposal for a regulation
Article 30 – paragraph 2 e (new)
2e. Very large online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request and obtain information on where their advertisements have been placed;
(b) can request and obtain information on which broker processed their data.
2021/07/08
Committee: IMCO
Amendment 1761 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall:
(1) be affiliated with an academic institution within the Union which certifies that the researcher is a researcher in good standing;
(2) be independent from commercial interests, including any very large online platforms;
(3) be independent from any government, administrative or other state bodies, outside the academic institution of affiliation if public;
(4) have undergone an independent background and security investigation, subject to the national legislation of the Member State of residence;
(5) be a resident of the Union;
(6) have proven records of expertise in the fields related to the risks investigated or related research methodologies; and
(7) commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/08
Committee: IMCO
Amendment 1768 #
Proposal for a regulation
Article 31 – paragraph 4 a (new)
4a. Where a very large online platform or a Digital Services Coordinator has grounds to believe that a researcher is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, access to data shall be withdrawn and the Digital Services Coordinator of establishment shall decide if and when access shall be restored and under what conditions.
2021/07/08
Committee: IMCO
Amendment 1787 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7a. Digital Services Coordinators and the Commission shall, once a year, report the following information:
(a) the number of requests made to them as referred to in paragraphs 1 and 2;
(b) the number of such requests that have been declined or withdrawn by the Digital Services Coordinator or the Commission and the reasons for which they have been declined or withdrawn, including following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1 and 2.
2021/07/08
Committee: IMCO
Amendment 1799 #
Proposal for a regulation
Article 33 – paragraph 1 – subparagraph 1 a (new)
Such reports shall include content moderation information, separated and presented for each Member State in which the services are offered and for the Union as a whole. The reports shall be published in at least one of the official languages of the Member States of the Union in which the services are offered.
2021/07/08
Committee: IMCO
Amendment 1827 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) accessibility of elements and functions of online platforms and digital services for persons with disabilities, aiming at consistency and coherence with existing harmonised accessibility requirements where those elements and functions are not already covered by existing harmonised European standards;
2021/07/08
Committee: IMCO
Amendment 1835 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1840 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. Where any of the standards under paragraph 1 have not been adopted by [24 months after the entry into force of this Regulation], the Commission may adopt a delegated act in accordance with Article 69 to set down rules, guidelines or a template for the harmonised application of the applicable articles. Once a standard has been established, the Commission shall cease work on its delegated act or withdraw it if already adopted.
2021/07/08
Committee: IMCO
Amendment 1856 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/08
Committee: IMCO
Amendment 1874 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic and repetitive failure to comply with the codes of conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as a signatory to the codes of conduct.
2021/07/08
Committee: IMCO
Amendment 1882 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency for all actors in the online advertising value chain, beyond the requirements of Articles 24 and 30.
2021/07/08
Committee: IMCO
Amendment 1889 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes three years after the application of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1892 #
Proposal for a regulation
Article 36 – paragraph 3 a (new)
3a. The Commission shall encourage all the actors in the online advertising ecosystem to endorse and comply with the commitments stated in the codes of conduct.
2021/07/08
Committee: IMCO
Amendment 1893 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for the protection of minors
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers and organisations representing minors, parents and civil society organisations or relevant authorities to further contribute to the protection of minors online.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online, which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least:
(a) age verification and age assurance models, taking into account the industry standards referred to in Article 34;
(b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/08
Committee: IMCO
Amendment 1935 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
3a. Paragraph 3 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which are not very large online platforms. Such enterprises shall be deemed to be under the jurisdiction of the Member State where their point of contact resides or is established. Where no point of contact is established or resides in a Member State, paragraph 3 shall apply.
2021/07/08
Committee: IMCO
Amendment 1963 #
Proposal for a regulation
Article 42 a (new)
Article 42a
General conditions for imposing penalties
1. Before penalties are issued under Article 42, when deciding whether to impose a penalty and deciding on the amount of the penalty in each individual case, due regard shall be given to the following:
(a) the nature, gravity and duration of the infringement, taking into account the nature, scope or purpose of the processing concerned as well as the number of recipients affected and the level of damage suffered by them;
(b) the intentional or negligent character of the infringement;
(c) any action taken by the provider to mitigate the damage of the infringement;
(d) the degree of responsibility of the provider, taking into account any other providers involved;
(e) any relevant previous infringements by the provider;
(f) the degree of cooperation with the Digital Services Coordinator(s) in order to remedy the infringement and mitigate the possible adverse effects of the infringement;
(g) the manner in which the infringement became known to the Member State;
(h) where measures have previously been ordered against the provider concerned with regard to the same subject-matter, compliance with those measures;
(i) adherence to approved codes of conduct pursuant to Articles 35 and 36; and
(k) any other aggravating or mitigating factor applicable to the circumstances of the case, such as financial benefits gained, or losses avoided, directly or indirectly, from the infringement.
2. If a provider infringes several provisions of this Regulation, the total amount of the penalty shall not exceed the amount specified in Article 42(3).
3. The exercise by a Member State of its powers under this Article and Article 42 shall be subject to appropriate procedural safeguards in accordance with Union and Member State law, including effective judicial remedy and due process.
2021/07/08
Committee: IMCO
Amendment 2030 #
Proposal for a regulation
Article 47 – paragraph 1
1. An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established and shall have legal personality.
2021/07/08
Committee: IMCO
Amendment 2049 #
Proposal for a regulation
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator may participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. Where a Member State has more than one representative present, only the final word of the Digital Services Coordinator shall be taken as the position of the Member State in question.
2021/07/08
Committee: IMCO
Amendment 2054 #
Proposal for a regulation
Article 48 – paragraph 2 – subparagraph 1 a (new)
Where a Member State has more than one representative present, only the Digital Services Coordinator shall be able to vote.
2021/07/08
Committee: IMCO
Amendment 2067 #
Proposal for a regulation
Article 48 – paragraph 5 a (new)
5a. The Board shall, where appropriate, consult interested parties and give them the opportunity to comment within a reasonable period. The Board shall make the results of the consultation procedure publicly available.
2021/07/08
Committee: IMCO
Amendment 2070 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure by a two-thirds majority of its members, following the consent of the Commission.
2021/07/08
Committee: IMCO
Amendment 2092 #
Proposal for a regulation
Article 49 a (new)
Article 49a
Reports
1. The Board shall draw up an annual report regarding its activities. The report shall be made public and be transmitted to the European Parliament, to the Council and to the Commission in all official languages of the Member States.
2. The annual report shall include, among other information, a review of the practical application of the opinions, guidelines, recommendations, advice and any other measures taken under Article 49(1).
2021/07/08
Committee: IMCO
Amendment 2140 #
Proposal for a regulation
Article 51 a (new)
Article 51a
Requirements for the Commission
1. The Commission shall perform its tasks under this Regulation in an impartial, transparent and timely manner. The Commission shall ensure that its units given responsibility for this Regulation have the adequate technical, financial and human resources to carry out their tasks.
2. When carrying out its tasks and exercising its powers in accordance with this Regulation, the Commission shall act with complete independence. It shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party.
2021/07/08
Committee: IMCO
Amendment 2145 #
Proposal for a regulation
Article 52 – paragraph 2
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect or misleading information. The statement of purpose shall include reasoning as to why and how the information is necessary and proportionate to the purpose, and why it cannot be obtained by other means.
2021/07/08
Committee: IMCO
Amendment 2150 #
Proposal for a regulation
Article 52 – paragraph 4
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading.
2021/07/08
Committee: IMCO
Amendment 2223 #
Proposal for a regulation
Article 59 – paragraph 4
4. In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement, any fines issued under Article 42 and the need to avoid double sanctioning of the same infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
2021/07/08
Committee: IMCO
Amendment 2282 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 23, 25, 31 and 34 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
2021/07/08
Committee: IMCO
Amendment 2284 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 23, 25, 31 and 34 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
2021/07/08
Committee: IMCO
Amendment 2292 #
Proposal for a regulation
Article 73 – paragraph 4 a (new)
4a. By three years from the date of application of this Regulation at the latest, the Commission shall carry out an assessment of the impact on costs for European service providers of any similar requirements, including those of Article 11, introduced by third countries, and of any new barriers to non-EU market access arising after the adoption of this Regulation. The Commission shall also assess the impact on the ability of European businesses and consumers to access and buy products and services from outside the Union.
2021/07/08
Committee: IMCO
Amendment 2293 #
Proposal for a regulation
Article 74 – paragraph 1 a (new)
1a. Chapter III, Section 4 shall apply from [date - 3 months after its entry into force].
2021/07/08
Committee: IMCO
Amendment 2294 #
Proposal for a regulation
Article 74 – paragraph 2
2. This Regulation, with the exception of Chapter III, Section 4, shall apply from [date - twelve months after its entry into force].
2021/07/08
Committee: IMCO