
434 Amendments of Stéphane SÉJOURNÉ related to 2020/0361(COD)

Amendment 100 #
Proposal for a regulation
Recital 4 a (new)
(4 a) As a Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the Union has made the provisions of the Convention an integral part of the Union legal order, binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and to other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity.” Given the ever-growing importance of digital services and platforms in private and public life, and in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects the rights of all recipients of services, including persons with disabilities.
2021/07/20
Committee: JURI
Amendment 101 #
Proposal for a regulation
Recital 5 a (new)
(5 a) Given the cross-border nature of the services at stake, EU action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right to access and choice of those services by all consumers and other recipients of services, including by persons with disabilities, is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers to the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.
2021/07/20
Committee: JURI
Amendment 102 #
Proposal for a regulation
Recital 5 b (new)
(5 b) The notions of ‘access’ or ‘accessibility’ are often used to mean affordability (financial access), availability, or access to data, use of networks, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’, which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.
2021/07/20
Committee: JURI
Amendment 105 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, among others, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. To assist Member States and providers, the Commission should provide guidelines as to how to interpret the interaction between different Union acts and how to prevent any duplication of requirements on providers or potential conflicts in the interpretation of similar requirements. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/20
Committee: JURI
Amendment 107 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. This Regulation should also respect the competences of Member States to adopt laws promoting freedom and pluralism of the media as well as cultural and linguistic diversity. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/20
Committee: JURI
Amendment 114 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 on Copyright and Related Rights in the Digital Single Market, which establish specific rules and procedures that should remain unaffected.
2021/07/20
Committee: JURI
Amendment 117 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined appropriately and also covers information relating to illegal content, products, services and activities where such information is itself illegal. In particular, that concept should be understood to refer to information, irrespective of its form, that under Union or national law as a result of its display on an intermediary service is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or illegal due to its direct connection to or promotion of an illegal activity, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, illegal trading of animals, plants and substances, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, including the Charter of Fundamental Rights of the European Union, and what the precise nature or subject matter is of the law in question.
2021/07/20
Committee: JURI
Amendment 126 #
Proposal for a regulation
Recital 12 a (new)
(12 a) Material disseminated for educational, journalistic, artistic or research purposes, or for the purposes of preventing or countering illegal content, including content which represents an expression of polemic or controversial views in the course of public debate, should not be considered illegal content. Similarly, material such as an eye-witness video of a potential crime should not be considered illegal merely because it depicts an illegal act. An assessment should determine the true purpose of the dissemination and whether the material is disseminated to the public for those purposes.
2021/07/20
Committee: JURI
Amendment 129 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/20
Committee: JURI
Amendment 132 #
Proposal for a regulation
Recital 13 a (new)
(13 a) Additionally, in order to avoid imposing obligations simultaneously on two providers for the same content, a hosting service should be defined as an online platform only when it has a direct relationship with the recipient of the service. A hosting provider that acts as the infrastructure for an online platform should not be considered an online platform on the basis of that relationship, where it merely implements the decisions of the online platform and serves its users only indirectly.
2021/07/20
Committee: JURI
Amendment 133 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a large or potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admission to a user group, such information should only be considered to be publicly available when users seeking to access such information are automatically registered or admitted without human intervention to decide or select the users to whom access is granted. The mere possibility to create groups of users of a given service, including a messaging service, should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a limited number of pre-determined persons, taking into account the potential for groups to become tools for wide dissemination of content to the public. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation where they do not meet the above criteria for "dissemination to the public". Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. File-sharing services and other cloud services fall within the scope of this Regulation, to the extent that such services are used to make the stored information available to the public at the direct request of the content provider. _________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/20
Committee: JURI
Amendment 156 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken in order to detect, identify and act against illegal content on a voluntary basis should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/20
Committee: JURI
Amendment 165 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case, where set down in Union acts, and, in particular, does not affect orders by national authorities in accordance with national legislation that implements European acts, in accordance with the conditions established in this Regulation and other European lex specialis. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Equally, nothing in this Regulation should prevent providers from implementing end-to-end encryption of their services.
2021/07/20
Committee: JURI
Amendment 166 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. This should be without prejudice to decisions of Member States to require service providers, who host information provided by users of their service, to apply due diligence measures.
2021/07/20
Committee: JURI
Amendment 172 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law including the Charter of Fundamental Rights of the European Union, on the basis of which such orders are issued differ considerably, and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
2021/07/20
Committee: JURI
Amendment 174 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the effective processing of those orders.
2021/07/20
Committee: JURI
Amendment 175 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, including the Charter of Fundamental Rights of the European Union and in particular Regulation (EU) 2016/679, and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The competent authorities of Member States should be able to object, before the Board, to orders to act against illegal content that they consider to be in breach of Union law, including the Charter. The procedure for objection should be simplified and fast-tracked when such orders are issued by an administrative or judicial authority of a Member State that is under an Article 7 procedure for infringement of European values pursuant to Article 2 TEU. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law, and to confidentiality requests by law enforcement authorities related to the non-disclosure of information.
2021/07/20
Committee: JURI
Amendment 180 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law, in conformity with Union law including the EU Charter of Fundamental Rights, enabling the issuance of the order, and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of national law, Union law or international law and the interests of international comity.
2021/07/20
Committee: JURI
Amendment 183 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information as defined in Union or national law in accordance with Union law, including the Charter of Fundamental Rights of the European Union, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
2021/07/20
Committee: JURI
Amendment 185 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives, such as the safety and trust of the recipients of the service, including minors, women and vulnerable users such as those with protected characteristics under Article 21 of the Charter, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/20
Committee: JURI
Amendment 191 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. At the same time, recipients should enter into such agreements willingly, without any misleading or coercive tactics, and a ban on dark patterns should therefore be introduced.
2021/07/20
Committee: JURI
Amendment 193 #
Proposal for a regulation
Recital 38 a (new)
(38 a) While an additional requirement should apply to very large online platforms, all providers should carry out a general self-assessment of potential risks related to their services, especially in relation to minors, and should take voluntary mitigation measures where appropriate. In order to ensure that the provider undertakes these actions, Digital Services Coordinators may ask for proof.
2021/07/20
Committee: JURI
Amendment 194 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC40 which do not also qualify as very large online platforms. In any public versions of such reports, providers of intermediary services should remove any information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider’s terms and conditions. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/20
Committee: JURI
Amendment 197 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Furthermore, the notice and action mechanism should be complemented by ‘stay down’ provisions whereby providers of hosting services should demonstrate their best efforts to prevent the reappearance of content which is identical to another piece of content that has already been identified and removed by them as illegal. The application of this requirement should not lead to any general monitoring obligation.
2021/07/20
Committee: JURI
Amendment 202 #
Proposal for a regulation
Recital 40 a (new)
(40 a) Notices should be directed to the actor that has the technical and operational ability to act and the closest relationship to the recipient of the service that provided the information or content, such as to an online platform and not to the hosting service provider which provides services to that online platform. Such hosting service providers should redirect such notices to the particular online platform and inform the notifying party of this fact.
2021/07/20
Committee: JURI
Amendment 203 #
Proposal for a regulation
Recital 40 b (new)
(40 b) Hosting providers should seek to act only against the items of information notified. This may include acts such as disabling hyperlinking to the items of information. Where the removal or disabling of access to individual items of information is technically or operationally unachievable due to legal, contractual or technological reasons, such as encrypted file and data storage and sharing services, hosting providers should inform the recipient of the service of the notification and seek action. If a recipient fails to act or delays action, or if the provider has reason to believe that the recipient has failed to act or is otherwise acting in bad faith, the hosting provider may suspend its service in line with its terms and conditions.
2021/07/20
Committee: JURI
Amendment 205 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/20
Committee: JURI
Amendment 206 #
Proposal for a regulation
Recital 41 a (new)
(41 a) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, either because it is illegal or because it is not allowed under its terms and conditions, it should do so in a timely manner, taking into account the potential harm of the infraction and the technical abilities of the provider. Information that could have a negative effect on minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, should be treated as a matter requiring urgency.
2021/07/20
Committee: JURI
Amendment 211 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have proven to be efficient, proportionate and reliable, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/20
Committee: JURI
Amendment 220 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of engaging, in good faith, in the out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies located in either the Member State of the recipient or of the provider and that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/20
Committee: JURI
Amendment 221 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, in particular Directive (EU) 2019/790 on Copyright and Related Rights in the Digital Single Market, which establish specific rules and procedures that should remain unaffected.
2021/07/08
Committee: IMCO
Amendment 230 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, including the EU Charter of Fundamental Rights, and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 231 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/20
Committee: JURI
Amendment 232 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving an imminent threat to the life or safety of persons, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing upon request all relevant information available to it, including where relevant the content in question and an explanation of its suspicion, and, unless instructed otherwise, should remove or disable the content. Information obtained by a law enforcement or judicial authority of a Member State in accordance with this Article should not be used for any purpose other than those directly related to the individual serious criminal offence notified. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/20
Committee: JURI
Amendment 235 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/20
Committee: JURI
Amendment 236 #
Proposal for a regulation
Recital 48 a (new)
(48 a) Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all available relevant information.
2021/07/20
Committee: JURI
Amendment 238 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from offering products, digital content on a commercial scale, or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential and accurate information to the providers of online marketplaces, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
2021/07/20
Committee: JURI
Amendment 239 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/08
Committee: IMCO
Amendment 239 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of online marketplaces covered should take effective steps that would be reasonably taken by a diligent operator, in accordance with a high industry standard of professional diligence, to regularly verify the accuracy, currency and reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online marketplaces covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers, which have taken the effective steps that would be reasonably taken by a diligent operator in accordance with a high industry standard of professional diligence, as required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Providers of online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. The online interface should allow traders to provide the information referred to in Article 22a of this Regulation, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information on the sustainability of products, and information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/07/20
Committee: JURI
Amendment 240 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. The online interface should allow traders to provide the information referred to in Article 22a of this Regulation, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information on the sustainability of products, and information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/07/20
Committee: JURI
Amendment 242 #
Proposal for a regulation
Recital 50 a (new)
(50 a) Providers of online marketplaces should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services. In compliance with the principle of no general monitoring, providers should inform recipients when the service or product they have acquired through their services is illegal. Once notified of an illegal product or service as foreseen in Article 14, providers of online marketplaces should take effective and proportionate measures to prevent such products or services from reappearing on their online marketplace.
2021/07/20
Committee: JURI
Amendment 244 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a large or potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admission to a user group, such information should only be considered to be publicly available when users seeking to access such information are automatically registered or admitted without human intervention to decide or select the users to whom access is granted. The mere possibility to create groups of users of a given service, including a messaging service, should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a limited number of pre-determined persons, taking into account the potential for groups to become tools for wide dissemination of content to the public. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation where they do not meet the above criteria for "dissemination to the public". Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. File-sharing services and other cloud services fall within the scope of this Regulation, to the extent that such services are used to make the stored information available to the public at the direct request of the content provider. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/08
Committee: IMCO
Amendment 247 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain the consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/20
Committee: JURI
Amendment 248 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, including regarding misleading information or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/20
Committee: JURI
Amendment 256 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products or the display of copyright-infringing content. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the way platforms' terms and conditions, including content moderation policies, are enforced, including through automatic means. With respect to this category of risks, particular attention should be paid to the detrimental effect of intimidation of the independent press and the harassment of journalists, in particular women, who are more often victims of hateful speech and online threats. These should be considered systemic risks as referred to in Article 26, as they pose a threat to democratic values, media freedom, and freedom of expression and information, and should be subject to dedicated mitigating measures as referred to in Article 27, and to priority notice through trusted flaggers as referred to in Article 19. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and the protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/20
Committee: JURI
Amendment 260 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and prevent the manipulation and exploitation of the service, including by the amplification of content which is counter to their terms and conditions, adapting their decision-making processes, or adapting their terms and conditions and content moderation policies and how those policies are enforced, while being fully transparent to the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources, including by displaying related public service advertisements instead of other commercial advertisements. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 261 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and intentional manipulation and exploitation of the service, including amplification of harmful content, adapting their decision-making processes, or adapting their terms and conditions, as well as making content moderation policies and the way they are enforced fully transparent for the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 273 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Moreover, these recommender systems can also impact media consumption and cultural practices of users, and may risk locking them into a bubble without providing them with the possibility to open up to different kinds of content. Consequently, very large online platforms should ensure that recipients are appropriately informed and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
2021/07/19
Committee: JURI
Amendment 278 #
Proposal for a regulation
Recital 62 a (new)
(62 a) The practice of very large online platforms of associating advertisements with content uploaded by users could indirectly lead to the monetisation and promotion of illegal content, or content that is in breach of their terms and conditions, and could risk considerably damaging the brand image of the buyers of advertising space. In order to prevent such practices, the very large online platforms should ensure, including through standard contractual guarantees to the buyers of advertising space, that the content to which they associate advertisements is legal and compliant with their terms and conditions. Furthermore, the very large online platforms should allow advertisers to have direct access to the results of audits carried out independently and evaluating the commitments and tools of platforms for protecting the brand image of the buyers of advertising space ('brand safety').
2021/07/19
Committee: JURI
Amendment 280 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. In addition, very large online platforms should label any known deep fake videos, audio or other files.
2021/07/19
Committee: JURI
Amendment 283 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers who meet the conditions set out in this Regulation. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/19
Committee: JURI
Amendment 284 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms, such as the dissemination of illegal content and the amplification of harmful content brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/19
Committee: JURI
Amendment 295 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation, as well as the compliance of online platforms with the provisions of these codes. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/07/19
Committee: JURI
Amendment 299 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken in order to detect, identify and act against illegal content on a voluntary basis should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/08
Committee: IMCO
Amendment 299 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation, harmful content or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with the purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/07/19
Committee: JURI
Amendment 303 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission has issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
2021/07/19
Committee: JURI
Amendment 313 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. This should be without prejudice to decisions of Member States to require service providers, who host information provided by users of their service, to apply due diligence measures.
2021/07/08
Committee: IMCO
Amendment 314 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV and Articles 8 and 9 by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures it has taken in the exercise of that jurisdiction.
2021/07/19
Committee: JURI
Amendment 316 #
Proposal for a regulation
Recital 77
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. Member States should also consider specialised training, in cooperation with Union bodies, offices and agencies, for relevant national authorities, in particular administrative authorities, who are responsible for issuing orders to act against illegal content and provide information.
2021/07/19
Committee: JURI
Amendment 317 #
Proposal for a regulation
Recital 78
(78) Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. In order to ensure coherence between the Member States, the Commission should adopt guidance on the procedures and rules related to the powers of Digital Services Coordinators.
2021/07/19
Committee: JURI
Amendment 322 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws, in conformity with Union law, including the EU Charter of Fundamental Rights, on the basis of which such orders are issued differ considerably, and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements to ensure the effective processing of those orders.
2021/07/08
Committee: IMCO
Amendment 326 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, intellectual property, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/19
Committee: JURI
Amendment 328 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, including the EU Charter of Fundamental Rights and in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The competent authorities of Member States should be able to object, before the Board, to orders to act against illegal content that they consider to be in breach of Union law, including the Charter. The procedure for objection should be simplified and fast-tracked when such orders are issued by an administrative or judicial authority of a Member State that is under an Article 7 procedure for infringement of European values according to Article 2 of TEU. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law, and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
2021/07/08
Committee: IMCO
Amendment 333 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law, in conformity with Union law, including the EU Charter of Fundamental Rights, enabling the issuance of the order, and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of national law, Union law or international law and the interests of international comity.
2021/07/08
Committee: IMCO
Amendment 336 #
Proposal for a regulation
Recital 97 a (new)
(97 a) The Commission should ensure that it is independent and impartial in its decision-making with regard to both Digital Services Coordinators and providers of services under this Regulation.
2021/07/19
Committee: JURI
Amendment 337 #
Proposal for a regulation
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties, including the right to challenge any investigative requests before a judicial authority within the Member State of establishment.
2021/07/19
Committee: JURI
Amendment 339 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information as defined in Union or national law in conformity with Union law, including the EU Charter of Fundamental Rights, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
2021/07/08
Committee: IMCO
Amendment 344 #
Proposal for a regulation
Recital 104
(104) In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is equally important that, where standardisation bodies are unable to agree on the standards needed to implement this Regulation fully, the Commission may choose to adopt delegated acts. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States' experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts.
2021/07/19
Committee: JURI
Amendment 348 #
Proposal for a regulation
Article 1 – paragraph 1
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market whilst ensuring the rights enshrined in the Charter of Fundamental Rights of the European Union, in particular the freedom of expression and information in an open and democratic society. In particular, it establishes:
2021/07/19
Committee: JURI
Amendment 351 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable, accessible and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/19
Committee: JURI
Amendment 358 #
Proposal for a regulation
Article 1 – paragraph 5 – point b
(b) Directive 2010/13/EU as amended by Directive (EU) 2018/1808;
2021/07/19
Committee: JURI
Amendment 360 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on Copyright and Related Rights in the Digital Single Market;
2021/07/19
Committee: JURI
Amendment 361 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on Copyright and Related Rights in the Digital Single Market;
2021/07/19
Committee: JURI
Amendment 363 #
Proposal for a regulation
Article 1 – paragraph 5 – point h
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Regulation XXX (General Product Safety Regulation);
2021/07/19
Committee: JURI
Amendment 364 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Directive (EU) 2019/882;
2021/07/19
Committee: JURI
Amendment 366 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5 a. The Commission shall, by [within one year of the adoption of this Regulation], publish guidelines with regard to the relations between this Regulation and the legislative acts listed in Article 1(5). These guidelines shall clarify any potential conflicts between the conditions and obligations listed in those legislative acts, specify which act prevails where actions taken in line with this Regulation fulfil the obligations of another legislative act, and specify which regulatory authority is competent.
2021/07/19
Committee: JURI
Amendment 375 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession, or any natural or legal person that is offering goods, digital content, or services on a commercial scale;
2021/07/19
Committee: JURI
Amendment 377 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. Furthermore, the notice and action mechanism should be complemented by ‘stay down’ provisions whereby providers of hosting services should demonstrate their best efforts to prevent the reappearance of content which is identical to another piece of content that has already been identified and removed by them as illegal. The application of this requirement should not lead to any general monitoring obligation.
2021/07/08
Committee: IMCO
Amendment 379 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- an ‘online search engine’ as defined in point (5) of Article 2 of Regulation (EU) 2019/1150;
2021/07/19
Committee: JURI
Amendment 380 #
Proposal for a regulation
Article 2 – paragraph 1 – point f a (new)
(f a) 'live streaming platform services' mean information society services of which the main or one of the main purposes is to give the public access to audio or video material that is broadcast live by its users, which it organises and promotes for profit-making purposes;
2021/07/19
Committee: JURI
Amendment 381 #
Proposal for a regulation
Article 2 – paragraph 1 – point f b (new)
(f b) 'private messaging services' mean number-independent interpersonal communications services as defined in Article 2(7) of Directive (EU) 2018/1972, excluding transmission of electronic mail as defined in Article 2 (h) of Directive 2002/58/EC;
2021/07/19
Committee: JURI
Amendment 391 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘online marketplace’ means an online platform that allows consumers to conclude distance contracts with other traders or consumers on their platform;
2021/07/19
Committee: JURI
Amendment 394 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and reliable, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/08
Committee: IMCO
Amendment 396 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(i a) ‘deep fake’ means an image, audio or video content that has been generated or manipulated using artificial intelligence tools and that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
2021/07/19
Committee: JURI
Amendment 406 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘dark pattern’ means a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision- making or choice.
2021/07/19
Committee: JURI
Amendment 408 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(q b) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
2021/07/19
Committee: JURI
Amendment 409 #
Proposal for a regulation
Article 2 – paragraph 1 – point q b (new)
(q b) 'minor' means a child below the age of 16, as established in Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 410 #
Proposal for a regulation
Article 2 – paragraph 1 – point q c (new)
(q c) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882;
2021/07/19
Committee: JURI
Amendment 423 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and effective safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom by online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/08
Committee: IMCO
Amendment 426 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of providers of online marketplaces, where such an online marketplace presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online marketplace itself or by a recipient of the service who is acting under its authority or control.
2021/07/19
Committee: JURI
Amendment 429 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, notably when it concerns vulnerable users, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/08
Committee: IMCO
Amendment 433 #
Proposal for a regulation
Recital 48 a (new)
(48a) Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it should remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all available relevant information.
2021/07/08
Committee: IMCO
Amendment 433 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union or national law, in accordance with Union law, including the Charter of Fundamental Rights of the European Union, and the requirements set out in this Regulation.
2021/07/19
Committee: JURI
Amendment 437 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the providers of online marketplaces, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
2021/07/08
Committee: IMCO
Amendment 440 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. This Regulation shall not prevent providers from offering end- to-end encrypted services. The provision of such services shall not constitute a reason for liability or for becoming ineligible for the exemptions from liability.
2021/07/19
Committee: JURI
Amendment 441 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of online marketplaces covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online marketplaces covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Providers of online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. The online interface should allow traders to provide the information referred to in Article 22a of this Regulation, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information on sustainability of products, and information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. __________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/07/08
Committee: IMCO
Amendment 447 #
Proposal for a regulation
Article 8 – paragraph 1 a (new)
1 a. If the provider cannot comply with the removal order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that has issued the order.
2021/07/19
Committee: JURI
Amendment 448 #
Proposal for a regulation
Article 8 – paragraph 1 b (new)
1 b. Where the provider does not have its main establishment or legal representative in the Member State of the competent authority that has issued the order and the provider believes that the implementation of an order issued under paragraph 1 would infringe the Charter of Fundamental Rights of the European Union, Union law, or the national law of the Member State in which the main establishment or legal representative of the provider is located, or does not meet the conditions of paragraph 2, the provider shall have the right to submit a reasoned request for a decision of the Digital Services Coordinator of the Member State of establishment. The provider shall inform the authority issuing the order of this submission.
2021/07/19
Committee: JURI
Amendment 449 #
Proposal for a regulation
Recital 50 a (new)
(50a) Providers of online marketplaces should demonstrate their best efforts to prevent the dissemination by traders of illegal products and services. In compliance with the no general monitoring principle, providers should inform recipients when the service or product they have acquired through their services is illegal. Once notified of an illegal product or service as foreseen in Article 14, providers of online marketplaces should take effective and proportionate measures to prevent such products or services from reappearing on their online marketplace.
2021/07/08
Committee: IMCO
Amendment 449 #
Proposal for a regulation
Article 8 – paragraph 1 c (new)
1 c. Upon receiving such a submission, the Digital Services Coordinator shall in a timely manner scrutinise the order and inform the provider of its decision. Where the Digital Services Coordinator agrees with the reasoning of the provider, in whole or in part, it shall inform, without undue delay, the Digital Services Coordinator of the Member State of the judicial or administrative authority issuing the order of its objection. The Digital Services Coordinator may choose to intervene on behalf of the provider in any redress, appeal or other legal processes in relation to the order.
2021/07/19
Committee: JURI
Amendment 451 #
Proposal for a regulation
Article 8 – paragraph 1 e (new)
1 e. Paragraphs 1b and 1c shall not apply in the case of very large online platforms or where content is manifestly illegal under Union law.
2021/07/19
Committee: JURI
Amendment 453 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
- the identification of the issuing authority and the means to verify the authenticity of the order;
2021/07/19
Committee: JURI
Amendment 457 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/08
Committee: IMCO
Amendment 464 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by the provider in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider.
2021/07/19
Committee: JURI
Amendment 467 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, including regarding misleading information or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/08
Committee: IMCO
Amendment 468 #
Proposal for a regulation
Article 8 – paragraph 2 – point c b (new)
(c b) where more than one provider of intermediary services is responsible for hosting the specific item, the order is issued to the most appropriate provider that has the technical and operational ability to act against the specific item.
2021/07/19
Committee: JURI
Amendment 469 #
Proposal for a regulation
Article 8 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders.
2021/07/19
Committee: JURI
Amendment 470 #
Proposal for a regulation
Article 8 – paragraph 2 b (new)
2 b. Member States shall ensure that providers have a right to appeal and to object to implementing the order and shall facilitate the use of and access to that right.
2021/07/19
Committee: JURI
Amendment 472 #
Proposal for a regulation
Article 8 – paragraph 3
3. The Digital Services Coordinator of the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67. Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, based on a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order should be enforced. Where the order to act against a specific item of illegal content under Union or national law has been issued by the national judicial or administrative authority of a Member State that is under an Article 7 procedure for infringement of European values according to Article 2 of TEU, any Digital Services Coordinator may object to the order directly before the Commission. The Commission shall assess the objection to the order as a matter of priority and decide whether the order should be enforced as swiftly as possible and no later than 48 hours upon receipt of the objection.
2021/07/19
Committee: JURI
Amendment 474 #
Proposal for a regulation
Article 8 – paragraph 3 – subparagraph 1 (new)
Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, based on a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order is to be enforced.
2021/07/19
Committee: JURI
Amendment 476 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products or the display of copyright-infringing content. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the way platforms' terms and conditions, including content moderation policies, are enforced, including through automatic means. With respect to this category of risks, particular attention should be paid to the detrimental effect of intimidation of independent press and the harassment of journalists, in particular women, who are more often victims of hateful speech and online threats. These should be considered systemic risks as referred to in Article 26, as they pose a threat to democratic values, media freedom, freedom of expression and information, and should be subject to dedicated mitigating measures as referred to in Article 27, and priority notice through trusted flaggers as referred to in Article 19. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 478 #
Proposal for a regulation
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law and administrative law in conformity with Union law, including the Charter of Fundamental Rights. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives followed therein.
2021/07/19
Committee: JURI
Amendment 484 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content and intentional manipulation and exploitation of the service, including amplification of harmful content, adapting their decision-making processes, or adapting their terms and conditions, as well as making content moderation policies and the way they are enforced fully transparent for the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform's economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 485 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order. Where no effect has been given to the order, providers of intermediary services shall provide without delay the authority issuing the order with a statement of reasons as to why the order was not given effect.
2021/07/19
Committee: JURI
Amendment 489 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
1 a. If the provider cannot comply with the information order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the authority that issued the information order.
2021/07/19
Committee: JURI
Amendment 490 #
Proposal for a regulation
Article 9 – paragraph 1 b (new)
1 b. Where the provider does not have its main establishment or legal representative in the Member State of the competent authority that issued the order, and believes that the implementation of an order issued under paragraph 1 would infringe the Charter, Union law, or the national law of the Member State in which the main establishment or legal representative of the provider is located, or does not meet the conditions of paragraph 2, the provider shall have the right to submit a reasoned request for a decision of the Digital Services Coordinator from the Member State of establishment. The provider shall inform the authority issuing the order of this submission.
2021/07/19
Committee: JURI
Amendment 491 #
Proposal for a regulation
Article 9 – paragraph 1 c (new)
1 c. Upon receiving such a submission, the Digital Services Coordinator shall in a timely manner scrutinise the order and inform the provider of its decision. Where the Digital Services Coordinator agrees with the reasoning of the provider, in whole or in part, the Digital Services Coordinator shall inform of its objection, without undue delay, the Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order. The Digital Services Coordinator may choose to intervene on behalf of the provider in any redress, appeal or other legal processes in relation to the order.
2021/07/19
Committee: JURI
Amendment 495 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Moreover, these recommender systems can also impact media consumption and cultural practices of users, and may risk locking them into a bubble without providing them with the possibility to open up to other content. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
2021/07/08
Committee: IMCO
Amendment 495 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent -1 (new)
-1 the identification of the issuing authority and the means to verify the authenticity of the order;
2021/07/19
Committee: JURI
Amendment 498 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/07/19
Committee: JURI
Amendment 501 #
Proposal for a regulation
Recital 63 a (new)
(63a) The practice of very large online platforms of associating advertisements with content uploaded by users could indirectly lead to the promotion of illegal content, or content that is in breach of their terms and conditions, and could risk considerably damaging the brand image of the buyers of advertising space. In order to prevent such practices, very large online platforms should ensure, including through standard contractual guarantees to the buyers of advertising space, that the content to which they associate advertisements is legal and compliant with their terms and conditions. Furthermore, very large online platforms should allow advertisers to have access to the results of audits carried out independently, evaluating the commitments and tools of platforms for protecting the brand image of the buyers of advertising space ("brand safety").
2021/07/08
Committee: IMCO
Amendment 503 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms, such as the dissemination of illegal content and the amplification of harmful content brought about by the platform's systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 505 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such a case, the point of contact may request the competent authority to provide a translation into the language declared by the provider;
2021/07/19
Committee: JURI
Amendment 506 #
Proposal for a regulation
Article 9 – paragraph 2 – point c a (new)
(c a) the order is issued only where no other effective means are available to receive the same specific item of information;
2021/07/19
Committee: JURI
Amendment 507 #
Proposal for a regulation
Article 9 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific template and form for such orders. It shall ensure that the form meets the standards set down in the Annex of [XXX the regulation on European Production and Preservation Orders for electronic evidence in criminal matters].
2021/07/19
Committee: JURI
Amendment 516 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation, as well as the compliance of online platforms with the provisions of these codes. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/07/08
Committee: IMCO
Amendment 519 #
Proposal for a regulation
Article -10 (new)
Article -10 Waiver 1. Providers of intermediary services may apply to the Commission for a waiver from the requirements of Chapter III, if they prove that they are: (a) not-for-profit or equivalent and serve a manifestly positive role in the public interest; (b) micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC; or (c) medium enterprises within the meaning of the Annex to Recommendation 2003/361/EC without any systemic risk related to illegal content. The providers shall present justified reasons for their request. 2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver in whole or in part from the requirements of this Chapter. 3. Upon the request of the Board or the provider, or on its own initiative, the Commission may review a waiver issued and revoke the waiver in whole or in part. 4. The Commission shall maintain a list of all waivers issued and their conditions and shall make this list publicly available.
2021/07/19
Committee: JURI
Amendment 522 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation, harmful content or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with the purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission's invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/07/08
Committee: IMCO
Amendment 523 #
Proposal for a regulation
Article 10 – paragraph 2 a (new)
2 a. Providers of intermediary services may establish the same single point of contact for this Regulation and for other Union law that requires a single point of contact. When doing so, the provider shall inform the Commission of this decision.
2021/07/19
Committee: JURI
Amendment 527 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission has issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
2021/07/08
Committee: IMCO
Amendment 535 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format, in a searchable archive of all the previous versions with their date of application.
2021/07/19
Committee: JURI
Amendment 539 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, and available remedies including applicable alternative dispute resolution mechanisms. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions, including information on the available remedies and the possibilities for opt-out, where relevant.
2021/07/19
Committee: JURI
Amendment 540 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall ensure that their terms and conditions are written in unambiguous and comprehensible language and prevent the recipients of their services from providing information that is not compliant with Union law or the law of the Member State where the information is provided. Any additional restrictions that providers of intermediary services may impose in relation to the use of their service and the information provided by the recipients of the service shall be in full compliance with the fundamental rights of the recipients of the services as enshrined in the Charter of Fundamental Rights of the European Union.
2021/07/19
Committee: JURI
Amendment 542 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall apply and enforce the restrictions referred to in paragraph 2 in a diligent, objective, timely, proportionate and non-discriminatory manner, with due regard to the rights and legitimate interests of all parties involved, including the applicable national and Union law, including the EU Charter on Fundamental Rights.
2021/07/19
Committee: JURI
Amendment 551 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Providers of intermediary services shall refrain from using dark patterns or other techniques to encourage the acceptance of terms and conditions, including the giving of consent to sharing personal and non-personal data.
2021/07/19
Committee: JURI
Amendment 559 #
Proposal for a regulation
Article 12 a (new)
Article 12a General Risk Assessment and Mitigation Measures 1. Providers of intermediary services shall identify, analyse and assess, at least once a year and at each significant revision of a service they provide thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to each of their services and shall include at least risks related to the dissemination of illegal content through their services and of any content that might have a negative effect on potential recipients of the service, in particular minors, and on gender equality. 2. Providers of intermediary services shall, wherever possible, attempt to put in place reasonable, proportionate and effective measures mitigating the risks identified, in line with applicable law and their terms and conditions. 3. Where the identified risk relates to minor recipients of the service, irrespective of whether the minor is acting in accordance with the terms and conditions, mitigation measures shall include, where needed and applicable: (a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure that those prioritise the best interests of the minor; (b) adapting or removing system design features that expose minors to, or promote, content, contact, conduct and contract risks that impair their physical, mental or moral development; (c) ensuring the highest levels of privacy, safety, consumer protection and security by design and default for individual recipients of the service under the age of 18; (d) if a service is targeted at minors, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support. 4. Providers of intermediary services shall, upon request, explain to the competent Digital Services Coordinator how they undertook this risk assessment and what mitigation measures they put in place.
2021/07/19
Committee: JURI
Amendment 560 #
Proposal for a regulation
Article 12 b (new)
Article 12b Fair consent choice screens 1. Providers of intermediary services that ask the recipients of their service for consent as required by Regulation (EU) 2016/679 to process personal data concerning them shall ensure that the end-user choice screens shown to that end are designed in a fair and neutral manner and do not in any way subvert or impair user autonomy, decision-making, or choice via the choice screens' structure, function or manner of operation. 2. A choice or decision made by the recipient of the service using an online interface or part thereof that does not comply with the requirements of paragraph 1 shall not constitute consent in the sense of Regulation (EU) 2016/679. 3. Paragraphs 1 and 2 shall also apply to consent given prior to the entry into force of this Regulation. 4. The Commission may adopt implementing acts to prescribe binding design aspects and functions of consent choice screens that fulfil the requirements of paragraph 1. 5. Providers of intermediary services shall accept the communication of consent choices made by the recipient of the service through automated means, including through standardised digital signals sent by the recipient's software used to access the service, such as web browsers and operating systems. 6. Providers of intermediary services shall respect the communication of choices made by the recipients of the service, including consent or withdrawal of consent to the processing of personal data, through automated means, such as through the settings of software placed on the market permitting electronic communications, including the retrieval and presentation of information on the internet. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions for the automated means referred to above. 7. The Board, in cooperation with the Commission, shall publish official guidelines indicating specific design patterns that qualify as subverting or impairing the autonomy, decision-making, or choice of the recipients of the service. The Board shall keep this list updated in the light of technological developments and, in the case of very large online platforms, of assessments related to systemic risks identified in accordance with Article 27(2). 8. The Commission may adopt implementing acts to prescribe the design and functions of online interfaces that facilitate the expression of consent in the sense of Regulation (EU) 2016/679 or other choices that may be expressed by the recipients of the service. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. Before the adoption of any measures pursuant to this paragraph, the Commission shall publish a draft thereof and invite all interested parties to submit their comments within the time period set out therein, which shall not be less than two months.
2021/07/19
Committee: JURI
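As a rough illustration of the automated consent signals envisaged in paragraphs 5 and 6 of the proposed Article 12b, the following minimal Python sketch shows a service reading a standardised signal from request headers before falling back to a choice screen. The header name Sec-GPC is borrowed from the Global Privacy Control proposal purely as an example; a DSA-specific signal, and the helper shown, are assumptions, not anything prescribed by the amendment.

def consent_from_headers(headers: dict) -> bool | None:
    """Return False if the recipient's software signalled a refusal,
    or None if no automated signal was present (in which case a fair
    and neutral choice screen, per paragraph 1, would be shown)."""
    # "Sec-GPC: 1" is the opt-out signal defined by the Global Privacy
    # Control proposal; it is used here only as an illustration.
    if headers.get("Sec-GPC") == "1":
        return False
    return None

# Example: a request carrying the signal is treated as a refusal of
# consent, overriding any choice made on a non-compliant screen.
print(consent_from_headers({"Sec-GPC": "1"}))  # False
print(consent_from_headers({}))                # None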
Amendment 563 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States' authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the order;
2021/07/19
Committee: JURI
Amendment 565 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
2021/07/19
Committee: JURI
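To make the reporting duty in points (a) and (b) concrete, here is a minimal Python sketch of how a provider might aggregate the per-category counts and average handling times that an Article 13 transparency report would contain; the data shape and function name are assumptions made for illustration, not anything the Regulation prescribes.

from collections import defaultdict
from datetime import timedelta

def transparency_summary(notices):
    """Aggregate notices into per-category counts and the average time
    needed to take action, in the style of Article 13(1)(b).
    `notices` is an iterable of (content_type, handling_time) pairs."""
    counts = defaultdict(int)
    totals = defaultdict(timedelta)
    for content_type, handling_time in notices:
        counts[content_type] += 1
        totals[content_type] += handling_time
    return {
        ctype: {"notices": counts[ctype],
                "average_handling_time": totals[ctype] / counts[ctype]}
        for ctype in counts
    }

# Example with two alleged-content categories.
sample = [("hate speech", timedelta(hours=4)),
          ("counterfeit", timedelta(hours=30)),
          ("hate speech", timedelta(hours=8))]
print(transparency_summary(sample))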
Amendment 578 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/07/19
Committee: JURI
Amendment 579 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. Where made available to the public, the annual transparency reports referred to in paragraph 1 shall not include information that may prejudice ongoing activities for the prevention, detection, or removal of illegal content or content counter to a hosting provider's terms and conditions.
2021/07/19
Committee: JURI
Amendment 585 #
Proposal for a regulation
Chapter III – Section 2 – title
2 Additional provisions applicable to providers of hosting services, including online platforms, and to providers of live streaming platform services and of private messaging services
2021/07/19
Committee: JURI
Amendment 587 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content, or content that is in breach of their terms and conditions. Those mechanisms shall be easy to access and user-friendly, shall allow for the submission of notices exclusively by electronic means, and may include: (a) a clearly identifiable banner or single reporting button, allowing users to notify quickly and easily the providers of these services of illegal content they have encountered; (b) providing information to the users on what is considered illegal content under Union and national law; (c) providing information to the users on available national public tools to signal illegal content to the competent authorities.
2021/07/19
Committee: JURI
Amendment 590 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question or its breach of the terms and conditions. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2021/07/19
Committee: JURI
Amendment 592 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content, or content that is in breach of the provider's terms and conditions;
2021/07/19
Committee: JURI
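The elements a valid notice must carry (an explanation of reasons, the exact electronic location, the notifier's name and e-mail address, and a good-faith statement, per Article 14(2) of the proposal) can be pictured as a simple record. The Python sketch below uses illustrative field names and a hypothetical completeness check that are not prescribed anywhere in the text.

from dataclasses import dataclass

@dataclass
class Notice:
    # Field names are illustrative; the substance tracks Article 14(2).
    reasons: str              # why the content is illegal or breaches the T&Cs
    exact_url: str            # electronic location of the item
    notifier_name: str
    notifier_email: str
    good_faith: bool          # good-faith statement of accuracy
    alleges_illegality: bool  # True: illegal content; False: T&C breach

    def is_sufficiently_substantiated(self) -> bool:
        """A rough completeness check a diligent operator might run
        before routing the notice for review."""
        return bool(self.reasons.strip()
                    and self.exact_url.startswith("http")
                    and self.good_faith)

# Example notice against a specific item of information.
n = Notice("Counterfeit listing of a protected trademark",
           "https://example.com/item/123", "Jane Doe",
           "jane@example.com", good_faith=True, alleges_illegality=True)
print(n.is_sufficiently_substantiated())  # True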
Amendment 615 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services, of live streaming platform services and of private messaging services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, or in respect of the recipient of the service who provided this information, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/19
Committee: JURI
Amendment 619 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services, of live streaming platform services and of private messaging services shall demonstrate their best efforts to prevent the reappearance of content which is identical to other content that has already been identified and removed by them as illegal. The application of this requirement shall not lead to any general monitoring obligation.
2021/07/19
Committee: JURI
Amendment 624 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying, removing, disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal, disabling of access, restriction of visibility or suspension or termination of monetisation, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/19
Committee: JURI
Amendment 628 #
Proposal for a regulation
Article 15 – paragraph 1 a (new)
1a. Where the removal of or disabling of access to specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the requirement to inform the recipient set out in paragraph 1 may be postponed by a period of six weeks in order to avoid interfering with potential ongoing criminal investigations. That period of six weeks may be renewed only following a reasoned decision of the competent authority to which the specific items of information were transmitted.
2021/07/19
Committee: JURI
Amendment 630 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of, the information and, where relevant, the territorial scope of the disabling of access or the restriction;
2021/07/19
Committee: JURI
Amendment 632 #
Proposal for a regulation
Article 1 – paragraph 5 – point c
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
2021/07/08
Committee: IMCO
Amendment 638 #
Proposal for a regulation
Article 15 a (new)
Article 15a Preservation of content and related data, and mandatory transmission of specific items of information 1. Providers of hosting services shall store the illegal content which has been removed or access to which has been disabled as a result of content moderation, or of an order to act against a specific item of illegal content as referred to in Article 8, as well as any related data removed as a consequence of the removal of such illegal content, which are necessary for administrative or judicial review proceedings, including out-of-court dispute settlement, against a decision to remove or disable access to illegal content and related data. 2. The illegal content and related data, as referred to in paragraph 1, shall be stored for six months from the date of removal or disabling. The illegal content shall, upon request from the competent authority or court, be preserved for a further specified period only if and for as long as necessary for ongoing administrative or judicial review proceedings, as referred to in paragraph 1. 3. Providers of hosting services shall ensure that the illegal content and related data stored pursuant to paragraph 1 are subject to appropriate technical and organisational safeguards. Those technical and organisational safeguards shall ensure that the illegal content and related data stored are accessed and processed only for the purposes referred to in paragraph 1, and shall ensure a high level of security of the personal data concerned. Providers of hosting services shall review and update those safeguards where necessary. 4. Providers of hosting services shall transmit to the competent authorities of the Member States the illegal content which has been removed or access to which has been disabled, whether such removal or disabling of access results from voluntary content moderation or from a use of the notification and action mechanism referred to in Article 14. This obligation of transmission applies under the following conditions: (a) illegal content referred to in this paragraph means content which is manifestly illegal and is an offence according to [Framework Decision 2008/913/JHA and Directive 2011/36/EU]; and (b) the competent law enforcement authority to which to transmit such illegal content is that of the Member State of the residence or establishment of the person who made the illegal content available, or, failing that, the law enforcement authority of the Member State in which the provider of hosting services is established or has its legal representative; or, failing that, the provider of hosting services shall inform Europol; (c) where the provider of hosting services is a very large online platform in accordance with Section 4 of Chapter III, it must also, when transmitting the illegal content, add a flag indicating illegal content which involves a threat to the life or safety of persons. 5. Each Member State shall notify to the Commission the list of its competent law enforcement authorities as referred to in paragraph 4.
2021/07/19
Committee: JURI
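The six-month preservation window of the proposed Article 15a(2), extendable only at the request of a competent authority or court, amounts to simple date bookkeeping. The Python sketch below shows one way a hosting provider might track it; all names, and the 182-day approximation of six months, are assumptions for illustration.

from dataclasses import dataclass, field
from datetime import datetime, timedelta

SIX_MONTHS = timedelta(days=182)  # approximation of the six-month period

@dataclass
class PreservedItem:
    content_id: str
    removed_at: datetime
    preserve_until: datetime = field(init=False)

    def __post_init__(self) -> None:
        self.preserve_until = self.removed_at + SIX_MONTHS

    def extend(self, until: datetime) -> None:
        """Extend preservation; Article 15a(2) permits this only upon
        request from a competent authority or court."""
        self.preserve_until = max(self.preserve_until, until)

    def must_delete(self, now: datetime) -> bool:
        """True once the preservation period has lapsed with no
        extension for pending review proceedings."""
        return now >= self.preserve_until

item = PreservedItem("abc123", removed_at=datetime(2021, 7, 19))
print(item.preserve_until)                     # roughly six months later
print(item.must_delete(datetime(2022, 2, 1)))  # True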
Amendment 641 #
Proposal for a regulation
Article 15 b (new)
Article 15b Notification of suspicions of criminal offences 1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. 2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative, or inform Europol. 3. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. 4. For the purpose of this Article, Member States shall notify to the Commission the list of their competent law enforcement or judicial authorities.
2021/07/19
Committee: JURI
Amendment 649 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content under Union or national law, or incompatible with its terms and conditions:
2021/07/19
Committee: JURI
Amendment 655 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
2021/07/19
Committee: JURI
Amendment 664 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetise content provided by the recipients;
2021/07/19
Committee: JURI
Amendment 666 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions of online marketplaces to suspend the provision of their services to traders;
2021/07/19
Committee: JURI
Amendment 670 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1a. Where the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months as set out in paragraph 1 shall be considered to start from the day on which the recipient was informed in accordance with Article 15(2).
2021/07/19
Committee: JURI
Amendment 677 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3 a (new)
- an ‘online search engine’ as defined in point (5) of Article 2 of Regulation (EU) 2019/1150;
2021/07/08
Committee: IMCO
Amendment 679 #
Proposal for a regulation
Article 2 – paragraph 1 – point f a (new)
(fa) live streaming platform services shall be defined as information society services of which the main or one of the main purposes is to give the public access to audio or video material that is broadcast live by its users, which it organises and promotes for profit-making purposes;
2021/07/08
Committee: IMCO
Amendment 680 #
Proposal for a regulation
Article 2 – paragraph 1 – point f b (new)
(fb) private messaging services shall be defined as number-independent interpersonal communications services as defined in Article 2(7) of Directive (EU) 2018/1972, excluding transmission of electronic mail as defined in Article 2 (h) of Directive 2002/58/EC;
2021/07/08
Committee: IMCO
Amendment 682 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned, or of the individuals or entities that have submitted notices, to seek redress against the decision before a court in accordance with the applicable law.
2021/07/19
Committee: JURI
Amendment 683 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Where a recipient seeks a resolution to multiple complaints, either party may request that the out-of-court dispute settlement body treat and resolve these complaints in a single dispute decision.
2021/07/19
Committee: JURI
Amendment 685 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms; and is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
2021/07/19
Committee: JURI
Amendment 688 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and of individuals or entities that have submitted notices;
2021/07/19
Committee: JURI
Amendment 692 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/19
Committee: JURI
Amendment 704 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘online marketplace’ means an online platform that allows consumers to conclude distance contracts with other traders or consumers on their platform;
2021/07/08
Committee: IMCO
Amendment 707 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are processed, assessed and decided upon in priority, and that best efforts are made to prevent future uploads of the same illegal content targeted by such notices, without prejudice to the implementation of a complaint and redress mechanism.
2021/07/19
Committee: JURI
Amendment 715 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content, as well as intentional manipulation and exploitation of the service in the sense of Article 26, paragraph 1(c);
2021/07/19
Committee: JURI
Amendment 719 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) deleted
2021/07/19
Committee: JURI
Amendment 720 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement, or other government or relevant commercial entity;
2021/07/19
Committee: JURI
Amendment 728 #
Proposal for a regulation
Article 19 – paragraph 2 – point c c (new)
(cc) it publishes, at least once a year, clear, easily comprehensible and detailed reports on any notices submitted in accordance with Article 14 during the relevant period. The report shall list notices categorised by the identity of the hosting service provider, the type of alleged illegal content or content violating terms and conditions concerned, and what action was taken by the provider. In addition, the report shall identify relationships between the trusted flagger and any online platform, law enforcement, or other government or relevant commercial entity, and explain the means by which the trusted flagger maintains its independence.
2021/07/19
Committee: JURI
Amendment 730 #
Proposal for a regulation
Article 2 – paragraph 1 – point p a (new)
(pa) ‘deep fake’ means a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful;
2021/07/08
Committee: IMCO
Amendment 734 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4a. Member States may recognise entities that were awarded the status of trusted flaggers in another Member State as a trusted flagger on their own territory. Upon request by a Member State, trusted flaggers may be awarded the status of European trusted flagger by the Board, in accordance with Article 48(2). The Commission shall keep a register of European trusted flaggers.
2021/07/19
Committee: JURI
Amendment 742 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 4a, 5 and 6.
2021/07/19
Committee: JURI
Amendment 744 #
Proposal for a regulation
Article 19 a (new)
Article 19a Accessibility requirements for online platforms 1. Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882. 2. Providers of online platforms shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation. 3. Providers of online platforms shall ensure that information, forms and measures provided pursuant to this Regulation are made available in a manner that is easy to find and accessible to persons with disabilities. 4. Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider. 5. In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements. 6. Providers of online platforms shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements. 7. Online platforms which are in conformity with harmonised standards or parts thereof, the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements. 8. Online platforms which are in conformity with the technical specifications or parts thereof adopted for Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
2021/07/19
Committee: JURI
Amendment 748 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, or content that is in breach of their terms and conditions.
2021/07/19
Committee: JURI
Amendment 749 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
2021/07/19
Committee: JURI
Amendment 761 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where (a) compelling reasons of law or public policy, including ongoing criminal investigations, justify avoiding or postponing notice to the recipient; (b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; or (c) the items removed were related to content covered by [Directive 2011/93/EU updated reference] or [Directive (EU) 2017/541 XXX New Ref to TCO Regulation].
2021/07/19
Committee: JURI
Amendment 767 #
Proposal for a regulation
Article 21
Article 21 (Notification of suspicions of criminal offences) deleted
2021/07/19
Committee: JURI
Amendment 772 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of providers of online marketplaces, where such an online marketplace presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online marketplace itself or by a recipient of the service who is acting under its authority or control.
2021/07/08
Committee: IMCO
Amendment 774 #
Proposal for a regulation
Article 21 – paragraph 2 b (new)
2b. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified.
2021/07/19
Committee: JURI
Amendment 775 #
Proposal for a regulation
Article 21 – paragraph 2 c (new)
2c. The Commission shall adopt an implementing act setting out a template for notifications under paragraph 1.
2021/07/19
Committee: JURI
Amendment 776 #
Proposal for a regulation
Article 21 – paragraph 2 d (new)
2d. Where a notification of suspicions of criminal offences includes information which may be seen as potential electronic information in criminal proceedings, Regulation XXX [E-evidence] shall apply.
2021/07/19
Committee: JURI
Amendment 777 #
Proposal for a regulation
Article 22 – title
Traceability of traders on online marketplaces
2021/07/19
Committee: JURI
Amendment 780 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services, the online marketplaces have obtained the following information:
2021/07/19
Committee: JURI
Amendment 783 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
2021/07/19
Committee: JURI
Amendment 784 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or [Article XX of the General Product Safety Regulation], or any relevant act of Union law; _________________ 51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
2021/07/19
Committee: JURI
Amendment 785 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of Union or national law, in conformity with Union law, including the EU Charter on Fundamental Rights, and the requirements set out in this Regulation.
2021/07/08
Committee: IMCO
Amendment 785 #
Proposal for a regulation
Article 22 – paragraph 1 a (new)
1a. Providers of online marketplaces shall require traders to provide the information referred to in points (a) and (e) immediately upon initial registration for their services. Traders shall be required to provide any supplementary material relating to the information requirements set out in Article 22(1) within a reasonable period, and prior to the use of the service and the offering of products and services to consumers.
2021/07/19
Committee: JURI
Amendment 786 #
Proposal for a regulation
Article 22 – paragraph 2
2. The providers of online marketplaces shall, upon receiving that information and before allowing traders to use their services, take effective steps that would reasonably be taken by a diligent operator in accordance with a high industry standard of professional diligence to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate, current and reliable, through the use of independent and reliable sources, including any freely accessible official online database or online interface made available by an authorised administrator, a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources.
2021/07/19
Committee: JURI
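One "freely accessible official online database" of the kind paragraph 2 refers to is the Commission's VIES VAT-number service. The Python sketch below (using the third-party requests library) queries what is, to the best of our knowledge, its public REST interface; the exact endpoint path and response fields are assumptions that should be verified before any real use.

import requests  # third-party HTTP client

def vat_number_appears_valid(country_code: str, vat_number: str) -> bool:
    """Check a trader's VAT number against the Commission's VIES
    service, as one example of the verification Article 22(2) expects.
    The endpoint path mirrors the published VIES REST interface and
    may change; treat it as illustrative."""
    url = ("https://ec.europa.eu/taxation_customs/vies/rest-api/"
           f"ms/{country_code}/vat/{vat_number}")
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return bool(response.json().get("isValid", False))

# Example: verify a (made-up) German VAT number before activating a trader.
# print(vat_number_appears_valid("DE", "123456789"))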
Amendment 791 #
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
3. Where the providers of online marketplaces obtain indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, the online marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/07/19
Committee: JURI
Amendment 792 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the trader fails to correct or complete that information, the providers of online marketplaces shall suspend the provision of their service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
2021/07/19
Committee: JURI
Amendment 795 #
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a. The providers of online marketplaces shall ensure that traders are given the ability to discuss any information viewed as inaccurate or incomplete directly with the provider before any suspension of services. This may take the form of the internal complaint-handling system under Article 17.
2021/07/19
Committee: JURI
Amendment 796 #
Proposal for a regulation
Article 22 – paragraph 3 b (new)
3b. If an online marketplace rejects an application for services or suspends services to a trader, the trader shall have recourse to the systems under Article 17 and Article 43 of this Regulation.
2021/07/19
Committee: JURI
Amendment 797 #
Proposal for a regulation
Article 22 – paragraph 3 c (new)
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform the online marketplace without delay of any changes to that information.
2021/07/19
Committee: JURI
Amendment 799 #
Proposal for a regulation
Article 22 – paragraph 4
4. The providers of online marketplaces shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
2021/07/19
Committee: JURI
Amendment 802 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the providers of online marketplaces shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States' competent authorities or the Commission for the performance of their tasks under this Regulation.
2021/07/19
Committee: JURI
Amendment 803 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law that is in conformity with Union law, including the EU Charter on Fundamental Rights, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
2021/07/08
Committee: IMCO
Amendment 805 #
Proposal for a regulation
Article 22 – paragraph 6
6. The providers of online marketplaces shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
2021/07/19
Committee: JURI
Amendment 807 #
Proposal for a regulation
Article 22 – paragraph 7
7. The online platform shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
deleted
2021/07/19
Committee: JURI
Amendment 810 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Additional provisions for online marketplaces related to illegal offers
1. The providers of online marketplaces shall take adequate measures in order to prevent the dissemination, by traders using their service, of offers for a product or a service which do not comply with Union law.
2. Where the providers of online marketplaces obtain indications, including on the basis of the elements listed in points (a) and (b) of Article 14(2), that an item of information referred to in Article 22a is inaccurate, the providers of online marketplaces shall request the trader to give evidence of the accuracy of that item of information or to correct it, without delay. Where traders fail to comply with such a request, the providers of online marketplaces shall suspend the trader's offer pending compliance with the request.
3. Before the trader's offer is made available on the online marketplace, the providers of online marketplaces shall make their best efforts to assess whether traders have provided the information referred to in paragraphs 1 and 2 of Article 22a, and whether the offer to consumers located in the Union is on the list, or the lists, of products or categories of products classified as non-compliant, according to any freely accessible official online database or online interface, or through direct requests to the trader to provide supporting documents from reliable sources. The providers of online marketplaces shall not authorise the trader to provide the offer online in case of non-compliance.
4. Where notified by market surveillance or customs authorities about the illegality of a trader's offer according to applicable law on product safety, the providers of online marketplaces shall remove the offers or disable access to them without delay and inform the respective traders and competent authorities.
5. The providers of online marketplaces shall demonstrate their best efforts to take effective and proportionate measures to prevent offers of counterfeit products as well as to prevent the reappearance of offers of previously notified and removed counterfeit products. To that end, providers of online marketplaces shall take into account the information received in accordance with Article 14 in the context of any content moderation system aiming at preventing reappearance, detecting, identifying, removing or disabling access to dangerous products offered on their marketplace. The measures referred to in this paragraph shall not lead to general monitoring as provided for in Article 7.
6. The providers of online marketplaces shall suspend without undue delay the provision of their services to traders that repeatedly provide illegal offers for a product or a service. They shall immediately notify their decision to the trader and the competent authorities.
7. Where the providers of online marketplaces become aware, irrespective of the means used, of the illegal nature of a product or service offered through their services, they shall inform without undue delay the recipients of the service that had acquired such a product or contracted such services about the illegality, the identity of the trader and any means of redress. Where the provider of the online marketplace does not have the contact details of the recipients of the service, the provider shall make publicly available and easily accessible on their online interface the information concerning the illegal products or services removed, the identity of the trader and any means of redress.
8. The providers of online marketplaces shall be entitled to a right to redress against traders failing to comply with their obligations towards the online marketplaces or consumers. Consumers shall be entitled to a right to redress against the providers of online marketplaces for the failure of the latter to comply with their obligations under Articles 22, 22a and 22b.
2021/07/19
Committee: JURI
Amendment 812 #
Proposal for a regulation
Article 22 b (new)
Article 22b
Additional provisions for online marketplaces related to illegal offers
1. The providers of online marketplaces shall take adequate measures in order to prevent the dissemination, by traders using their service, of offers for a product or a service which do not comply with Union law.
2. Where the providers of online marketplaces obtain indications, including on the basis of the elements listed in points (a) and (b) of Article 14(2), that an item of information referred to in Article 22a is inaccurate, the providers of online marketplaces shall request the trader to give evidence of the accuracy of that item of information or to correct it, without delay. Where traders fail to comply with such a request, the providers of online marketplaces shall suspend the trader's offer pending compliance with the request.
3. Before the trader's offer is made available on the online marketplace, the providers of online marketplaces shall make their best efforts to assess whether traders have provided the information referred to in paragraphs 1 and 2 of Article 22a, and whether the offer to consumers located in the Union is on the list, or the lists, of products or categories of products classified as non-compliant, according to any freely accessible official online database or online interface, or through direct requests to the trader to provide supporting documents from reliable sources. The providers of online marketplaces shall not authorise the trader to provide the offer online in case of non-compliance.
4. Where notified by market surveillance or customs authorities about the illegality of a trader's offer according to applicable law on product safety, the providers of online marketplaces shall remove the offers or disable access to them without delay and inform the respective traders and competent authorities.
5. The providers of online marketplaces shall demonstrate their best efforts to take effective and proportionate measures to prevent offers of counterfeit products as well as to prevent the reappearance of offers of previously notified and removed counterfeit products. To that end, providers of online marketplaces shall take into account the information received in accordance with Article 14 in the context of any content moderation system aiming at preventing reappearance, detecting, identifying, removing or disabling access to dangerous products offered on their marketplace. The measures referred to in this paragraph shall not lead to general monitoring as provided for in Article 7.
6. The providers of online marketplaces shall suspend without undue delay the provision of their services to traders that repeatedly provide illegal offers for a product or a service. They shall immediately notify their decision to the trader and the competent authorities.
7. Where the providers of online marketplaces become aware, irrespective of the means used, of the illegal nature of a product or service offered through their services, they shall inform without undue delay the recipients of the service that had acquired such a product or contracted such services about the illegality, the identity of the trader and any means of redress. Where the provider of the online marketplace does not have the contact details of the recipients of the service, the provider shall make publicly available and easily accessible on their online interface the information concerning the illegal products or services removed, the identity of the trader and any means of redress.
8. The providers of online marketplaces shall be entitled to a right to redress against traders failing to comply with their obligations towards the online marketplaces or consumers. Consumers shall be entitled to a right to redress against the providers of online marketplaces for the failure of the latter to comply with their obligations under Articles 22, 22a and 22b.
2021/07/19
Committee: JURI
Amendment 816 #
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
(ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions;
2021/07/19
Committee: JURI
Amendment 819 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of Key Performance Indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
2021/07/19
Committee: JURI
Amendment 821 #
Proposal for a regulation
Article 23 a (new)
Article 23a
Online advertising and recommender systems
1. Online platforms that directly or indirectly display advertising to the recipients of the service or use recommender systems shall not use inferred data resulting from the profiling of the recipients or any personal data collected about them in services provided by third parties. Online platforms may use personal data explicitly provided or declared by the recipients, provided that they have been granted consent within the meaning of Article 4(11) of Regulation (EU) 2016/679. Online platforms shall ensure that the option that does not require the use of personal data is activated by default and that users of the service have the option to opt in to personalised advertisements or recommendations.
2. Online platforms shall, where applicable, provide an easily available functionality on their online interface allowing the recipients of the service, at any time, to:
(a) declare, modify and delete their personal data referred to in paragraph 1;
(b) modify and delete any categories used by the platform to categorise the content of advertisements or recommendations;
(c) exclude data collected from other related products or services or from previous engagement with certain content, pages or users.
Article 12b(4) applies accordingly to the communication of consent referred to in paragraph 1 and the choices made by the recipient of the service mentioned above.
3. Online platforms that use recommender systems and systems for selecting and displaying advertisements shall set out in an easily accessible place in their online interface, such as in their terms and conditions, and separately at the moment the advertisement or the content recommendation to recipients of the service takes place, in a clear, accessible and easily comprehensible manner, relevant information on the functioning of these systems, in particular their parameters, and ensure that significant changes to the information provided on their online interfaces are traceable over time.
4. Online platforms shall set out in their terms and conditions relevant information as to how the company may interfere with the regular operation and optimisation goal of the recommender system and ensure that significant changes to the information provided on the site are traceable over time.
5. Pursuant to the transparency reporting obligations of Articles 13, 23 and 44, online platforms shall provide, in a clear, accessible and easily comprehensible manner, transparency as to the trust and safety operations addressed to recommender systems. This transparency shall include, at a minimum:
(a) comprehensive definitions of content that platforms apply specific content moderation measures to and information about specific content moderation practices that are applied to such content;
(b) aggregate data that accounts for the total views and view rate of content that was subsequently removed pursuant to orders issued in accordance with Articles 8 and 9 or on the basis of content moderation engaged in at the provider's own initiative;
(c) aggregate data on the relative share of violative content compared to the total volume of content on the service and/or the overall amount of such content;
(d) aggregate data on the reach and recommendation of, as well as engagement with, violative content;
(e) aggregate data on how long after being uploaded violative content was de-amplified or down-ranked.
6. The parameters referred to in paragraph 3 shall include, at a minimum:
(a) the criteria used by the relevant systems;
(b) an indication of the importance that specific criteria have for the outputs produced by the relevant systems;
(c) the optimisation goals of the relevant systems;
(d) if applicable, a list of categories of personal data taken into account by the relevant systems, the sources of this data, and an explanation of the role that the behaviour of the recipients of the service plays in how the relevant systems produce their outputs;
(e) in the case of very large online platforms, the summary of the risk assessments referred to in Article 26 and the description of the mitigation measures referred to in Article 27.
2021/07/19
Committee: JURI
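To make the disclosure duty in paragraphs 3 and 6 of the proposed Article 23a concrete, a platform could publish its main parameters as a machine-readable record. The sketch below is illustrative only; the field names and values are assumptions, not terms used by the amendment.

# Illustrative Python record of the minimum disclosure under paragraph 6.
recommender_disclosure = {
    "criteria": ["topical relevance", "recency", "accounts followed"],
    "criteria_importance": {
        "topical relevance": 0.5,  # indication of weight per criterion
        "recency": 0.3,
        "accounts followed": 0.2,
    },
    "optimisation_goals": ["predicted meaningful interactions"],
    "personal_data_categories": ["on-service viewing history"],
    "data_sources": ["declared interests", "on-service behaviour"],
    "behaviour_role": "past engagement raises the weight of similar content",
}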
Amendment 823 #
Proposal for a regulation
Article 24 – title
Online advertising transparency and control
2021/07/19
Committee: JURI
Amendment 824 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that directly and indirectly display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, meaningful, salient, uniform and unambiguous manner and in real time:
2021/07/19
Committee: JURI
Amendment 825 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
2021/07/19
Committee: JURI
Amendment 828 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed and the logic involved;
2021/07/19
Committee: JURI
Amendment 829 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(ca) whether the advertisement was selected using an automated mechanism, such as ad exchange mechanisms, and, if so, the identity of the natural or legal person responsible for the system;
2021/07/19
Committee: JURI
Amendment 830 #
Proposal for a regulation
Article 24 – paragraph 1 – point c b (new)
(cb) if the online platform uses automated systems to determine the recipients of the service to whom the advertisement should be displayed, meaningful information about the reasons why a given advertisement has been deemed relevant for a specific recipient of the service.
2021/07/19
Committee: JURI
Amendment 831 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
The online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform, in particular:
(a) to withdraw consent or to object to processing;
(b) to obtain access to the personal data concerning the data subject;
(c) to obtain rectification of inaccurate personal data concerning the data subject;
(d) to obtain erasure of personal data without undue delay.
Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed, in accordance with Article 19 of Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 833 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms that suggest advertised content that the recipients of the service have not explicitly looked for or subscribed to shall ensure that the recipients of the service can identify, for each specific suggestion, in a clear and unambiguous manner and in real time, meaningful information about the criteria used to suggest this content to the recipient, including, where applicable, personal data of the recipient taken into account pursuant to Article XY.
2021/07/19
Committee: JURI
Amendment 834 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Providers of online platforms shall, by default, not make the recipients of their service subject to behavioural and micro-targeted advertisements unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent in line with the requirements under Regulation (EU) 2016/679 and Article 12(2b). Providers of online platforms shall ensure this requirement applies to previous choices expressed by individual recipients of the service.
2021/07/19
Committee: JURI
Amendment 835 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider.
2021/07/08
Committee: IMCO
Amendment 835 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Where a recipient exercises any of the rights referred to in points (a), (c) or (d) of paragraph 2, the online platform must immediately cease displaying advertisements using the personal data concerned or using parameters which were set using this data.
2021/07/19
Committee: JURI
Amendment 836 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request and obtain information on where their advertisements have been placed;
(b) can request and obtain information on which broker treated their data;
(c) can indicate specific locations where their ads cannot be placed.
In case of non-compliance with this provision, advertisers shall have the right to judicial redress.
2021/07/19
Committee: JURI
Amendment 837 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Providers of online platforms shall provide individual recipients of the service with the possibility to modify or influence the parameters used to display advertisements to them. The default parameters shall be the most respectful and protective of consumers' rights possible.
2021/07/19
Committee: JURI
Amendment 838 #
Proposal for a regulation
Article 24 – paragraph 1 d (new)
Online platforms shall also build special protections for individual recipients of the service below the age of 16 to limit their exposure to advertising. Advertisements that are targeted or micro-targeted toward individuals or segments of individuals who are below the age of 18 on the basis of their personal data, behaviour, the tracking of their activities or profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 shall not be permitted.
2021/07/19
Committee: JURI
Amendment 839 #
Proposal for a regulation
Article 24 – paragraph 1 d (new)
The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article.
2021/07/19
Committee: JURI
Amendment 842 #
Proposal for a regulation
Chapter III – Section 4 – title
4 Additional obligations for very large online platforms, live streaming platforms, private messaging providers and search engines to manage systemic risks
2021/07/19
Committee: JURI
Amendment 843 #
Proposal for a regulation
Article 25 – title
Very large online platforms, live streaming platforms, private messaging providers and search engines
2021/07/19
Committee: JURI
Amendment 844 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platform services, live streaming platform services, private messaging services and search engine services which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/07/19
Committee: JURI
Amendment 845 #
Proposal for a regulation
Article 8 – paragraph 3
3. The Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67. Where, upon receiving the copy of the order, at least three Digital Services Coordinators consider that the order violates Union law or national law that is in conformity with Union law, including the Charter, they may object to the enforcement of the order before the Board, based on a reasoned statement. Following a recommendation of the Board, the Commission may decide whether the order shall be enforced. Where the order to act against a specific item of illegal content under Union or national law has been issued by the national judicial or administrative authority of a Member State that is under an Article 7 procedure for infringement of European values according to Article 2 TEU, any Digital Services Coordinator may object to the order directly before the Commission. The Commission shall assess the objection to the order as a matter of priority and decide whether the order should be enforced as swiftly as possible and no later than 48 hours upon receipt of the objection.
2021/07/08
Committee: IMCO
Amendment 850 #
Proposal for a regulation
Article 8 – paragraph 4
4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law, including the EU Charter on Fundamental Rights.
2021/07/08
Committee: IMCO
Amendment 854 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis and at least once a year thereafter, the probability and severity of any systemic risks stemming from the design, intrinsic characteristics, functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 858 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order, without undue delay, of its receipt and the effect given to the order. Where no effect has been given to the order, providers of intermediary services shall provide the authority issuing the order, without delay, with a statement of reasons as to why the order was not given effect.
2021/07/08
Committee: IMCO
Amendment 859 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services, including unsafe and non-compliant products and services, in the case of online marketplaces;
2021/07/19
Committee: JURI
Amendment 860 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services;
2021/07/19
Committee: JURI
Amendment 863 #
Proposal for a regulation
Article 26 – paragraph 1 – point a a (new)
(aa) the funding of illegal content, including models based on advertising;
2021/07/19
Committee: JURI
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 869 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, such as ‘deep fakes’ or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, democratic values, media freedom and freedom of expression of journalists, as well as their ability to verify facts, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/19
Committee: JURI
Amendment 872 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective according to which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/07/08
Committee: IMCO
Amendment 877 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. When conducting risk assessments, very large online platforms shall involve representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess.
2021/07/19
Committee: JURI
Amendment 878 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10, or in the official language of the Member State that issues the order against the specific item of illegal content. In such case, the point of contact may request the competent authority to provide translation into the language declared by the provider;
2021/07/08
Committee: IMCO
Amendment 881 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any systemic risks, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/19
Committee: JURI
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/19
Committee: JURI
Amendment 885 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, design, the features or functioning of their services, or their terms and conditions;
2021/07/19
Committee: JURI
Amendment 889 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of and targeting of advertisements in association with the service they provide;
2021/07/19
Committee: JURI
Amendment 897 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation describing the reasons why those measures were not put in place to the Board, in view of issuing specific recommendations, and to the independent auditors for the purposes of the audit report.
Following the written explanation of the reasons of the very large online platform not to put in place mitigating measures, and where necessary, the Board shall issue specific recommendations as to the mitigation measures that the very large online platform shall implement. The very large online platform shall, within one month of receiving these recommendations, implement the recommended measures or set out any alternative measures it intends to take to address the identified risks.
In case of systemic failure of a very large online platform to take effective mitigating measures and in case of repeated non-compliance with the recommendations, the Board may advise the Commission and the Digital Services Coordinators to impose sanctions.
2021/07/19
Committee: JURI
Amendment 898 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
2021/07/19
Committee: JURI
Amendment 899 #
Proposal for a regulation
Article 27 – paragraph 1 b (new)
1b. The Board shall evaluate the implementation and effectiveness of the mitigating measures undertaken by very large online platforms listed in Article 27(1) and, where necessary, may issue recommendations.
2021/07/19
Committee: JURI
Amendment 900 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
2021/07/19
Committee: JURI
Amendment 903 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
2021/07/19
Committee: JURI
Amendment 911 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
2021/07/19
Committee: JURI
Amendment 915 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/19
Committee: JURI
Amendment 919 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
2021/07/19
Committee: JURI
Amendment 920 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(ca) have been certified by the Commission for the performance of this task;
2021/07/19
Committee: JURI
Amendment 921 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(fa) a description of specific elements that could not be audited, and an explanation of why these could not be audited;
2021/07/19
Committee: JURI
Amendment 922 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
2021/07/19
Committee: JURI
Amendment 929 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
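One way to satisfy the Article 29 requirement of at least one option not based on profiling is a per-recipient setting that switches from the personalised ranking to an ordering derived from the content alone, for example purely chronological. The sketch below is a minimal illustration under that assumption; the item structure and setting name are invented for readability.

def rank_feed(items, user_settings, personalised_ranker):
    # Profiling-based ranking only if the recipient has opted in.
    if user_settings.get("profiling_based_recommendations", False):
        return personalised_ranker(items)
    # Non-profiling option: the order depends only on the items themselves.
    return sorted(items, key=lambda item: item["published_at"], reverse=True)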
Amendment 933 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall ensure that their terms and conditions prevent the recipients of their services from providing information that is not compliant with Union law or the law of the Member State where the information is provided. Any additional restrictions that providers of intermediary services may impose in relation to the use of their service and the information provided by the recipients of the service shall be in full compliance with the fundamental rights of the recipients of the services as enshrined in the EU Charter on Fundamental Rights.
2021/07/08
Committee: IMCO
Amendment 935 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, and available remedies including applicable alternative dispute resolution mechanisms. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions, including information on the available remedies and the possibilities for opt-out, where relevant.
2021/07/08
Committee: IMCO
Amendment 941 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall apply and enforce the restrictions referred to in paragraph 2 in a diligent, objective, timely, proportionate and non-discriminatory manner, with due regard to the rights and legitimate interests of all parties involved, including the applicable national and Union law, including the EU Charter on Fundamental Rights.
2021/07/08
Committee: IMCO
Amendment 944 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Online platforms shall ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service.
2021/07/19
Committee: JURI
Amendment 945 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
2021/07/19
Committee: JURI
Amendment 947 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make available to relevant authorities and vetted researchers meeting the requirements of Article 31(4), through application programming interfaces, a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/19
Committee: JURI
Amendment 950 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a) the content of the advertisement, including the name of the product, service or brand and the object of the advertisement;
2021/07/19
Committee: JURI
Amendment 952 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(ba) the natural or legal person or group who paid for the advertisement;
2021/07/19
Committee: JURI
Amendment 958 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The Board shall, after consulting trusted flaggers and vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
2021/07/19
Committee: JURI
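A repository entry under Article 30, as amended, might take a shape along the following lines when exposed through an application programming interface. The field names below are assumptions keyed to the points of paragraph 2; they are not prescribed by the text.

from dataclasses import dataclass, field

@dataclass
class AdRepositoryEntry:
    ad_content: str                # point (a): content, product/service/brand, object
    displayed_on_behalf_of: str    # the natural or legal person concerned
    paid_for_by: str               # point (ba): who paid for the advertisement
    first_displayed: str           # ISO date, start of the display period
    last_displayed: str            # ISO date; entry retained one year after this
    targeting_parameters: dict = field(default_factory=dict)  # main parameters
    recipients_reached: int = 0    # aggregate figure only; no personal data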
Amendment 959 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms selling advertising for display on their online interface shall ensure via standard contractual clauses with the purchasers of advertising space that the content with which the advertisement is associated is compliant with the terms and conditions of the platform, or with the law of the Member State where the recipients of the service to whom the advertisement will be displayed are located.
2021/07/19
Committee: JURI
Amendment 960 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms shall be prohibited from profiling or targeting minors with personalised advertising, in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 961 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. Very large online platforms that display advertising on their online interfaces shall conduct, at their own expense and upon request of advertisers, independent audits performed by organisations complying with the criteria set out in Article 28(2). Such audits shall be based on fair and proportionate conditions agreed between platforms and advertisers, shall be conducted with a reasonable frequency and shall entail:
(a) conducting quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with platforms’ terms and conditions;
(b) monitoring for and detecting fraudulent use of their services to fund illegal activities;
(c) assessing the performance of their tools in terms of brand safety.
The audit report shall include an opinion on the performance of the platforms’ tools in terms of brand safety. Where the audit opinion is not positive, the report shall make operational recommendations to the platforms on specific measures in order to achieve compliance. The platforms shall make available to advertisers, upon request, the results of such audits.
2021/07/19
Committee: JURI
Amendment 962 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. Very large online platforms shall take adequate measures to detect inauthentic videos (‘deep fakes’). When detecting such videos, they should label them as inauthentic in a way that is clearly visible for the internet user.
2021/07/19
Committee: JURI
Amendment 963 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
2d. Very large online platforms shall offer users the opportunity to check if their username and password have been compromised in a data leak, such as through the pwned open source database.
2021/07/19
Committee: JURI
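The "pwned open source database" presumably refers to the Have I Been Pwned "Pwned Passwords" service; on that assumption, the check can be performed without the password ever leaving the platform, using the service's k-anonymity range API: only the first five characters of the SHA-1 hash are transmitted.

import hashlib
import urllib.request

def password_was_leaked(password: str) -> bool:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = "https://api.pwnedpasswords.com/range/" + prefix
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Each response line reads "<hash suffix>:<breach count>".
    return any(line.split(":", 1)[0] == suffix for line in body.splitlines())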
Amendment 964 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and without delay, full and continuous access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. With regard to moderation and recommender systems, very large online platforms shall provide the Digital Services Coordinator or the Commission, upon request, with access to algorithms and associated data that allow the detection of possible biases which could lead to the dissemination of illegal content, or content that is in breach with their terms and conditions, or presents threats to fundamental rights, including freedom of expression. Where a bias is detected, very large online platforms should expeditiously correct it following the recommendations of the Digital Services Coordinator or the Commission. Very large online platforms should be able to demonstrate their compliance at every step of the process pursuant to this Article.
2021/07/19
Committee: JURI
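As a sketch of the kind of bias detection that Article 31(1), as amended, would enable, a supervisor with access to recommender data could compare how often different groups of recipients are exposed to a flagged category of content. The data layout below is an assumption for illustration; real audits would use far richer signals.

from collections import defaultdict

def exposure_rates(impressions):
    """impressions: iterable of (group, was_flagged_content) pairs."""
    shown = defaultdict(int)
    flagged = defaultdict(int)
    for group, was_flagged in impressions:
        shown[group] += 1
        flagged[group] += int(was_flagged)
    return {group: flagged[group] / shown[group] for group in shown}

# A persistent gap between groups' rates would be one indication of possible bias.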
Amendment 968 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment, three Digital Services Coordinators of destination or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, understanding and mitigation of systemic risks as set out in Articles 26(1) and 27.
2021/07/19
Committee: JURI
Amendment 976 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily accessible, comprehensible, and detailed reports on any content moderation they engaged in during the relevant period. The reports shall be available in searchable archives. Those reports shall include, in particular, information on the following, as applicable:
2021/07/08
Committee: IMCO
Amendment 976 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, scientific researchers shall be affiliated with academic institutions, be independent from commercial interests and from the very large online platform they seek data from or its competitors, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/19
Committee: JURI
Amendment 979 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for the following reasons:
(a) in the case of a request under paragraph 1, the very large online platform does not have and cannot obtain with reasonable effort access to the data;
(b) in the case of a request under paragraph 2, the very large online platform does not have access to the data or providing access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.
2021/07/19
Committee: JURI
Amendment 983 #
Proposal for a regulation
Article 31 – paragraph 6 – point a
(a) in the case of a request under paragraph 1, the very large online platform does not have and cannot obtain with reasonable effort access to the data;
2021/07/19
Committee: JURI
Amendment 985 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
2021/07/08
Committee: IMCO
Amendment 985 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
(b) in the case of a request under paragraph 2, the very large online platform does not have access to the data or providing access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.
2021/07/19
Committee: JURI
Amendment 988 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7a. Upon completion of the research envisaged in Article 31(2), the vetted researchers shall make their research publicly available, taking into account the rights and interests of the recipients of the service concerned, in compliance with Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 989 #
Proposal for a regulation
Article 31 – paragraph 7 b (new)
7b. Digital Services Coordinators and the Commission shall, once a year, report the following information:
(a) the number of requests made to them as referred to in paragraphs 1 and 2;
(b) the number of such requests that have been declined by the Digital Services Coordinator or the Commission and the reasons for which they have been declined;
(c) the number of such requests that have been declined by the Digital Services Coordinator or the Commission, including the reasons for which they have been declined, following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1 and 2.
2021/07/19
Committee: JURI
Amendment 1000 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a. The information provided shall be broken down per Member State in which services are offered and in the Union as a whole.
2021/07/08
Committee: IMCO
Amendment 1003 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
2a. The reports shall include information on content moderation broken down per Member State in which the services are offered and in the Union as a whole and shall be published in the official languages of the Member States of the Union.
2021/07/19
Committee: JURI
Amendment 1007 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/07/08
Committee: IMCO
Amendment 1010 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of standards set by relevant European and international standardisation bodies, subject to transparent, multi-stakeholder and inclusive processes in line with Regulation (EU) 1025/2012, for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following:
(a) age assurance and age verification;
(b) child impact assessments;
(c) child-centred and age-appropriate design;
(d) child-centred and age-appropriate terms and conditions.
2021/07/19
Committee: JURI
Amendment 1013 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content as defined in Union and national law and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2021/07/19
Committee: JURI
Amendment 1020 #
Proposal for a regulation
Chapter III – Section 2 – title
Additional provisions applicable to providers of hosting services, including online platforms and to providers of live streaming platform services and of private messaging services
2021/07/08
Committee: IMCO
Amendment 1021 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/19
Committee: JURI
Amendment 1022 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services, providers of live streaming platform services and of private messaging services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content, or content that is in breach with their terms and conditions. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means and may include:
(a) a clearly identifiable banner or single reporting button, allowing users to notify quickly and easily the providers of these services of illegal content they have encountered;
(b) providing information to the users on what is considered illegal content under Union and national law;
(c) providing information to the users on available national public tools to signal illegal content to the competent authorities.
2021/07/08
Committee: IMCO
Amendment 1023 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain a set of harmonised key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain, in order to facilitate effective cross-platform monitoring.
2021/07/19
Committee: JURI
Amendment 1026 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, shall regularly monitor and evaluate the achievement of their objectives, and shall publish their conclusions. Furthermore, they shall ensure that there is a common alert mechanism managed at EU level to allow for real-time and coordinated responses.
2021/07/19
Committee: JURI
Amendment 1028 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic and repetitive failure to comply with the Codes of Conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as a signatory to the Codes of Conduct.
2021/07/19
Committee: JURI
Amendment 1033 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question or its breach of the terms and conditions. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2021/07/08
Committee: IMCO
Amendment 1038 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content, or content that is in breach with providers' terms and conditions;
2021/07/08
Committee: IMCO
Amendment 1038 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
(ba) the setting-up of a unique identifier that will enable advertisers and publishers to identify and track a campaign throughout its lifecycle.
2021/07/19
Committee: JURI
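The unique identifier in point (ba) could be minted once when a campaign is created and carried unchanged on every downstream record, letting advertisers and publishers reconcile the campaign end to end. The record layout below is an illustrative assumption.

import uuid

def new_campaign(advertiser: str) -> dict:
    return {"campaign_id": str(uuid.uuid4()), "advertiser": advertiser}

def log_impression(campaign: dict, publisher: str) -> dict:
    # The same campaign_id travels through bid, impression and billing records.
    return {"campaign_id": campaign["campaign_id"], "publisher": publisher}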
Amendment 1040 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those Codes two years after the application of this Regulation.
2021/07/19
Committee: JURI
Amendment 1041 #
Proposal for a regulation
Article 36 – paragraph 3 a (new)
3a. The Commission shall encourage all the players in the online advertising value chain to endorse and comply with the commitments stated in the codes of conduct.
2021/07/19
Committee: JURI
Amendment 1042 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for the protection of minors
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers and organisations representing minors, parents and civil society organisations or relevant authorities to further contribute to the protection of minors online.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online, which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least:
(a) age verification and age assurance models, taking into account the industry standards referred to in Article 34;
(b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/19
Committee: JURI
Amendment 1043 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content, or content that is in breach of the providers' terms and conditions;
2021/07/08
Committee: IMCO
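As an illustration of the notice elements in points (a) and (b) above, a minimal sketch of a notice record with a "diligent economic operator" plausibility check; the structure and names (Notice, is_sufficiently_precise) are hypothetical assumptions, not prescribed by the amendments.

```python
from dataclasses import dataclass, field
from urllib.parse import urlparse

@dataclass
class Notice:
    """Hypothetical notice carrying the elements of Article 14(2)(a)-(b)
    as amended: the reasons plus the exact URL(s) of the content."""
    reasons: str                              # point (a)
    urls: list = field(default_factory=list)  # point (b): exact URL or URLs
    extra_identification: str = ""            # point (b): additional information

    def is_sufficiently_precise(self) -> bool:
        """A diligent-operator check in miniature: a stated reason
        and at least one well-formed URL."""
        has_valid_url = any(
            urlparse(u).scheme in ("http", "https") for u in self.urls
        )
        return bool(self.reasons.strip()) and has_valid_url

notice = Notice(
    reasons="Listing offers a product prohibited under Union law.",
    urls=["https://marketplace.example/item/123"],
)
print(notice.is_sufficiently_precise())  # True
```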
Amendment 1053 #
Proposal for a regulation
Article 40 – paragraph 1
1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation and final jurisdiction as to disputes on orders issued under Articles 8 and 9.
2021/07/19
Committee: JURI
Amendment 1054 #
Proposal for a regulation
Article 40 – paragraph 1 a (new)
1a. By way of derogation from paragraph 1, the Member State in which the consumers have their residence shall have jurisdiction for the purposes of Articles 22, 22a and 22b of this Regulation and the Member State in which the authority issuing the order is situated shall have jurisdiction for the purposes of Articles 8 and 9 of this Regulation.
2021/07/19
Committee: JURI
Amendment 1056 #
Proposal for a regulation
Article 40 – paragraph 4
4. Paragraphs 1, 1a, 2 and 3 are without prejudice to Article 43(2), the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3.
2021/07/19
Committee: JURI
Amendment 1059 #
Proposal for a regulation
Article 41 – paragraph 3 a (new)
3a. Following a request to the Commission and in cases of infringements that persist, could cause serious harm to recipients of the service, or could seriously affect their fundamental rights, the Digital Services Coordinator of the country of destination may be entitled to additional powers in the framework of joint investigations as referred to in Article 46.
2021/07/19
Committee: JURI
Amendment 1072 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Where the complaint concerns an alleged harm to the recipients of the service, the Member State where the recipient resides shall have jurisdiction for the purposes of the complaint.
2021/07/19
Committee: JURI
Amendment 1074 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services, of live streaming platform services and of private messaging services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, or in respect of the recipient of the service who provided this information, in a timely, diligent, non-discriminatory and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/08
Committee: IMCO
Amendment 1074 #
Proposal for a regulation
Article 43 a (new)
Article 43a
Rights to effective judicial remedies
1. Without prejudice to any available administrative or non-judicial remedy, any recipient of the service or representative organisations shall have the right to an effective judicial remedy where he or she suffered harm as a result of an infringement of Articles 26(1) and 27(1).
2. In determining whether the very large online platform has complied with its obligations under Article 27(1), and in light of the principle of proportionality, the availability of suitable and effective measures shall be taken into account.
3. Such proceedings may be brought before the courts of the Member State where the recipient of the service has his or her habitual residence.
4. Without prejudice to any other administrative or non-judicial remedy, any recipient of the service or representative organisations shall have the right to an effective judicial remedy where the Digital Services Coordinator which is competent pursuant to Articles 40 and 43 does not handle a complaint or does not inform the recipient of the service within three months on the progress or outcome of the complaint lodged pursuant to Article 43. Proceedings against a Digital Services Coordinator under this paragraph shall be brought before the courts of the Member State where the Digital Services Coordinator is established.
2021/07/19
Committee: JURI
Amendment 1080 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services, of live streaming platform services and of private messaging services shall demonstrate their best efforts to prevent the reappearance of content which is identical to another piece of content that has already been identified and removed by them as illegal. The application of this requirement shall not lead to any general monitoring obligation.
2021/07/08
Committee: IMCO
Amendment 1080 #
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 1
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
2021/07/19
Committee: JURI
Amendment 1083 #
Proposal for a regulation
Article 45 – paragraph 2 – introductory part
2. A request or recommendation pursuant to paragraph 1 shall at least indicate:
2021/07/19
Committee: JURI
Amendment 1086 #
Proposal for a regulation
Article 45 – paragraph 3
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1 and assess the matter in view of taking specific investigatory or enforcement measures to ensure compliance without undue delay. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
2021/07/19
Committee: JURI
Amendment 1088 #
Proposal for a regulation
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of the result of the investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. The Digital Services Coordinator shall at least conduct a preliminary assessment of the issue raised.
2021/07/19
Committee: JURI
Amendment 1089 #
Proposal for a regulation
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto, and a statement of reasons in case of a decision, following its investigation, not to take measures to ensure compliance with this Regulation.
2021/07/19
Committee: JURI
Amendment 1092 #
Proposal for a regulation
Article 45 – paragraph 5
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission and the Digital Services Coordinators, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
2021/07/19
Committee: JURI
Amendment 1093 #
Proposal for a regulation
Article 45 – paragraph 6
6. The Commission, in cooperation with the Digital Services Coordinators, shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board.
2021/07/19
Committee: JURI
Amendment 1094 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or, disable access to or otherwise restrict the visibility of specific items of information provided by the recipients of the service or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying or, removing or disabling access to or reducing the visibility of that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access or the restriction of visibility or the suspension or termination of monetization, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/08
Committee: IMCO
Amendment 1095 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission, in cooperation with the Digital Services Coordinators, concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. Where the Digital Services Coordinator of establishment fails to comply with the request to take the necessary measures before the end of the two-month period, the Commission shall reallocate the case without delay to the Digital Services Coordinator initiating the request.
2021/07/19
Committee: JURI
Amendment 1098 #
Proposal for a regulation
Article 46 – title
Joint investigations, cooperation among Digital Services Coordinators and requests for Commission intervention
2021/07/19
Committee: JURI
Amendment 1099 #
Proposal for a regulation
Article 15 – paragraph 1 a (new)
1a. Where the removal of or disabling of access to specific items of information is followed by the transmission of those specific items of information in accordance with Article 15a, the requirement to inform the recipient set out in paragraph 1 may be postponed by a period of six weeks in order to avoid interfering with potential ongoing criminal investigations. The period of six weeks may be renewed only following a motivated decision of the competent authority to which the specific items of information were transmitted.
2021/07/08
Committee: IMCO
Amendment 1099 #
Proposal for a regulation
Article 46 – paragraph 1 a (new)
1a. Where the Digital Services Coordinator of the country of destination considers that an alleged infringement exists and causes serious harm to a large number of recipients of the service in that Member State, or could seriously affect their fundamental rights, it may request the Commission to set up a joint investigation between the Digital Services Coordinator of the country of establishment and the requesting Digital Services Coordinator of the country of destination.
2021/07/19
Committee: JURI
Amendment 1100 #
1b. The Commission, in cooperation with the Digital Services Coordinators, shall assess such a request and, following a positive opinion of the Board, shall set up a joint investigation in which the Digital Services Coordinator of the country of destination may be entitled to exercise the following additional powers with respect to the provider of intermediary services concerned by the alleged infringement:
(a) to obtain access to the confidential version of the reports published by the intermediary service providers referred to in Article 13 and, where applicable, in Articles 23 and 24, as well as to the annual reports drawn up by the other competent authorities pursuant to Article 44;
(b) to obtain access to data collected by the Digital Services Coordinator of the country of establishment for the purpose of supervision of that provider on the territory of the Digital Services Coordinator of the country of destination;
(c) to initiate proceedings and assess the matter in view of taking specific investigatory or enforcement measures to ensure compliance, where the seriousness of the suspected infringement would require an immediate response that would not allow for the provisions of Article 45 to apply;
(d) to request interim measures, as referred to in Article 41(2)(e);
2021/07/19
Committee: JURI
Amendment 1101 #
1c. The Commission decision setting up the joint investigation shall define a deadline by which the Digital Services Coordinator of the country of establishment and the Digital Services Coordinator launching the request pursuant to paragraph 2 shall agree on a common position on the joint investigation and, where applicable, on the enforcement measures to be adopted. If no agreement is reached within this deadline, the case shall be referred to the Commission pursuant to Article 45(5).
2021/07/19
Committee: JURI
Amendment 1105 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails the removal of, the disabling of access to, the restriction of the visibility of, or the demonetisation of, the information and, where relevant, the territorial scope of the disabling of access or the restriction;
2021/07/08
Committee: IMCO
Amendment 1105 #
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
(ca) issue specific recommendations for the implementation of Article 27 and advise on the possible application of sanctions in case of repeated non-compliance;
2021/07/19
Committee: JURI
Amendment 1112 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/19
Committee: JURI
Amendment 1114 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
2021/07/19
Committee: JURI
Amendment 1123 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/19
Committee: JURI
Amendment 1124 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/19
Committee: JURI
Amendment 1127 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Preservation of content and related data, and mandatory transmission of specific items of information
1. Providers of hosting services shall store the illegal content which has been removed or access to which has been disabled as a result of content moderation, or of an order to act against a specific item of illegal content as referred to in Article 8, as well as any related data removed as a consequence of the removal of such illegal content, which are necessary for administrative or judicial review proceedings, including out-of-court dispute settlement, against a decision to remove or disable access to illegal content and related data.
2. The illegal content and related data, as referred to in paragraph 1, shall be stored for six months from the date of removal or disabling. The illegal content shall, upon request from the competent authority or court, be preserved for a further specified period only if and for as long as necessary for ongoing administrative or judicial review proceedings, as referred to in paragraph 1.
3. Providers of hosting services shall ensure that the illegal content and related data stored pursuant to paragraph 1 are subject to appropriate technical and organisational safeguards. Those technical and organisational safeguards shall ensure that the illegal content and related data stored are accessed and processed only for the purposes referred to in paragraph 1, and ensure a high level of security of the personal data concerned. Providers of hosting services shall review and update those safeguards where necessary.
4. Providers of hosting services shall transmit to the competent authorities of the Member States the illegal content which has been removed or access to which has been disabled, whether such removal or disabling of access is a result of voluntary content moderation or of the use of the notification and action mechanism referred to in Article 14. This obligation of transmission applies under the following conditions:
(a) illegal content referred to in this paragraph means content which is manifestly illegal and is an offence according to Framework Decision 2008/913/JHA and Directive 2011/36/EU; and
(b) the competent law enforcement authority to which to transmit such illegal content is that of the Member State of the residence or establishment of the person who made the illegal content available, or, failing that, the law enforcement authority of the Member State in which the provider of hosting services is established or has its legal representative; or, failing that, the provider of hosting services shall inform Europol;
(c) when the provider of hosting services is a very large online platform in accordance with Section 4 of Chapter III, it must also, when transmitting the illegal content, add an indicating flag for illegal content which involves a threat to the life or safety of persons.
5. Each Member State shall notify to the Commission the list of its competent law enforcement authorities as referred to in paragraph 4.
2021/07/08
Committee: IMCO
Amendment 1127 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/19
Committee: JURI
Amendment 1128 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/19
Committee: JURI
Amendment 1134 #
Proposal for a regulation
Article 15 b (new)
Article 15b
Notification of suspicions of criminal offences
1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol.
3. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place or be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
4. For the purpose of this Article, Member States shall notify to the Commission the list of their competent law enforcement or judicial authorities.
2021/07/08
Committee: IMCO
Amendment 1143 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, as well as individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the online platform not to act after having received a notice, and against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content under Union or national law, or incompatible with its terms and conditions:
2021/07/08
Committee: IMCO
Amendment 1155 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable access to or restrict the visibility of the information;
2021/07/08
Committee: IMCO
Amendment 1167 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to restrict the ability to monetise content provided by the recipients;
2021/07/08
Committee: IMCO
Amendment 1170 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions of online marketplaces to suspend the provision of their services to traders;
2021/07/08
Committee: IMCO
Amendment 1173 #
Proposal for a regulation
Article 17 – paragraph 1 a (new)
1a. When the decision to remove or disable access to the information is followed by the transmission of this information in accordance with Article 15a, the period of at least six months as set out in paragraph 1 shall be considered to start from the day on which the recipient was informed in accordance with Article 15(2).
2021/07/08
Committee: IMCO
Amendment 1193 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint and that the decisions referred to in paragraph 4 are not solely taken on the basis of automated means.
2021/07/08
Committee: IMCO
Amendment 1212 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
2021/07/08
Committee: IMCO
Amendment 1268 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content, as well as intentional manipulation and exploitation of the service in the sense of Article 26, paragraph 1(c);
2021/07/08
Committee: IMCO
Amendment 1273 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, law enforcement, or other government or relevant commercial entity;
2021/07/08
Committee: IMCO
Amendment 1301 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4a. Member States may recognise entities that were awarded the status of trusted flagger in another Member State as a trusted flagger on their own territory. Upon request by a Member State, trusted flaggers can be awarded the status of European trusted flagger by the Board, in accordance with Article 48, paragraph 2. The Commission shall keep a register of European trusted flaggers.
2021/07/08
Committee: IMCO
Amendment 1313 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 4a, 5 and 6.
2021/07/08
Committee: IMCO
Amendment 1320 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, or content that is in breach of their terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1351 #
Proposal for a regulation
Article 21
Article 21
Notification of suspicions of criminal offences
Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place and likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
deleted
2021/07/08
Committee: IMCO
Amendment 1369 #
Proposal for a regulation
Article 22 – title
Traceability of traders on online marketplaces
2021/07/08
Committee: IMCO
Amendment 1373 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services, the online marketplaces have obtained the following information:
2021/07/08
Committee: IMCO
Amendment 1386 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the payment account details of the trader, where the trader is a natural person;
2021/07/08
Committee: IMCO
Amendment 1388 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) the name, address, telephone number and electronic mail address of the economic operator established in the Union and carrying out the tasks in accordance with Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or [Article XX of the General Product Safety Regulation], or any relevant act of Union law;
__________________
51 Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
2021/07/08
Committee: IMCO
Amendment 1399 #
Proposal for a regulation
Article 22 – paragraph 1 a (new)
1a. Providers of online marketplaces shall require traders to provide the information referred to in points (a) and (e) immediately upon initial registration for their services. Traders shall be required to provide any supplementary material relating to the information requirements set out in Article 22(1) within a reasonable period, and prior to the use of the service and the offering of products and services to consumers.
2021/07/08
Committee: IMCO
Amendment 1405 #
Proposal for a regulation
Article 22 – paragraph 2
2. The providers of online marketplaces shall, upon receiving that information and before allowing traders to use their services, make best efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate through the use of any freely accessible official online database or online interface made available by an authorised administrator or a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources.
2021/07/08
Committee: IMCO
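A minimal sketch, purely illustrative and not part of the amendment text, of the best-efforts completeness check a marketplace might run on the trader information before activating an account; in practice this step would also query the official online databases the amendment refers to, and all names here (TraderInfo, looks_complete) are hypothetical.

```python
from dataclasses import dataclass, fields

@dataclass
class TraderInfo:
    """Hypothetical bundle of items a marketplace might collect
    under Article 22(1) as amended."""
    name: str                 # point (a): name and contact details
    address: str
    payment_account: str      # point (c), for natural persons
    economic_operator: str    # point (d)
    self_certification: str   # point (e)

def looks_complete(info: TraderInfo) -> bool:
    """Best-efforts plausibility check before the trader may use the
    service; a real marketplace would additionally verify the values
    against freely accessible official databases."""
    return all(str(getattr(info, f.name)).strip() for f in fields(info))

trader = TraderInfo(
    name="Example Trader Ltd", address="1 Example Street, Dublin",
    payment_account="IE00EXAMPLE", economic_operator="EU Rep GmbH",
    self_certification="Products comply with Union law",
)
print(looks_complete(trader))  # True
```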
Amendment 1417 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the providers of online marketplaces obtain indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that online marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/07/08
Committee: IMCO
Amendment 1418 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information, the providers of online marketplaces shall suspend the provision of their services to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
2021/07/08
Committee: IMCO
Amendment 1424 #
Proposal for a regulation
Article 22 – paragraph 3 a (new)
3a. The providers of online marketplaces shall ensure that traders are given the ability to discuss any information viewed as inaccurate or incomplete directly with the online marketplace before any suspension of services. This may take the form of the internal complaint-handling system under Article 17.
2021/07/08
Committee: IMCO
Amendment 1426 #
Proposal for a regulation
Article 22 – paragraph 3 b (new)
3b. If an online marketplace rejects an application for services or suspends services to a trader, the trader shall have recourse to the systems under Article 17 and Article 43 of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1428 #
Proposal for a regulation
Article 22 – paragraph 3 c (new)
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform the online marketplace without delay of any changes to the information provided.
2021/07/08
Committee: IMCO
Amendment 1436 #
Proposal for a regulation
Article 22 – paragraph 4
4. The providers of online marketplaces shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
2021/07/08
Committee: IMCO
Amendment 1438 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the providers of online marketplaces shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
2021/07/08
Committee: IMCO
Amendment 1443 #
Proposal for a regulation
Article 22 – paragraph 6
6. The providers of online marketplaces shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
2021/07/08
Committee: IMCO
Amendment 1449 #
Proposal for a regulation
Article 22 – paragraph 7
7. The online platform shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
deleted
2021/07/08
Committee: IMCO
Amendment 1461 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Compliance by design
1. Providers of online marketplaces shall design and organise their online interface in a fair and user-friendly way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
2. The online interface shall allow traders to provide in particular the information referred to under paragraph 6 of Article 22, the information referred to in Article 6 of Directive 2011/83/EU on Consumer Rights, information allowing for the unequivocal identification of the product or the service, and, where applicable, information on the sustainability of products and information on labelling, including CE marking, according to the Union legislation on product safety and compliance.
3. This Article is without prejudice to additional requirements under other Union acts, including the [General Product Safety Regulation] and the [Market Surveillance Regulation].
2021/07/08
Committee: IMCO
Amendment 1472 #
Proposal for a regulation
Article 23 – paragraph 1 – point c a (new)
(ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification for those decisions;
2021/07/08
Committee: IMCO
Amendment 1478 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of key performance indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
2021/07/08
Committee: IMCO
Amendment 1483 #
Proposal for a regulation
Article 24 – title
Online advertising transparency and control
2021/07/08
Committee: IMCO
Amendment 1487 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
2021/07/08
Committee: IMCO
Amendment 1493 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed and the logic involved;
2021/07/08
Committee: IMCO
Amendment 1505 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
The online platform shall design and organise its online interface in such a way that recipients of the service can easily and efficiently exercise their rights under applicable Union law in relation to the processing of their personal data for each specific advertisement displayed to the data subject on the platform, in particular:
(a) to withdraw consent or to object to processing;
(b) to obtain access to the personal data concerning the data subject;
(c) to obtain rectification of inaccurate personal data concerning the data subject;
(d) to obtain erasure of personal data without undue delay.
Where a recipient exercises any of these rights, the online platform must inform any parties to whom the personal data concerned in points (a) to (d) have been disclosed, in accordance with Article 19 of Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1513 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Where a recipient exercises any of the rights referred to in points (a), (c) or (d) of paragraph 2, the online platform must immediately cease displaying advertisements using the personal data concerned or using parameters which were set using this data.
2021/07/08
Committee: IMCO
Amendment 1515 #
Proposal for a regulation
Article 24 – paragraph 1 c (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request and obtain information on where their advertisements have been placed;
(b) can request and obtain information on which broker processed their data;
(c) can indicate the specific locations where their advertisements cannot be placed.
In case of non-compliance with this provision, advertisers shall have the right to judicial redress.
2021/07/08
Committee: IMCO
Amendment 1526 #
Proposal for a regulation
Chapter III – Section 4 – title
Additional obligations for very large online platforms, live streaming platforms, private messaging providers and search engines to manage systemic risks
2021/07/08
Committee: IMCO
Amendment 1528 #
Proposal for a regulation
Article 25 – title
Very large online platforms, live streaming platforms, private messaging providers and search engines
2021/07/08
Committee: IMCO
Amendment 1531 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platform services, live streaming platform services, private messaging services and search engine services which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/07/08
Committee: IMCO
Amendment 1545 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis and at least once a year thereafter, the probability and severity of any systemic risks stemming from the design, intrinsic characteristics, functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1555 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of illegal content and content that is in breach of their terms and conditions through their services, including unsafe and non-compliant products and services, in the case of online marketplaces;
2021/07/08
Committee: IMCO
Amendment 1560 #
Proposal for a regulation
Article 26 – paragraph 1 – point a a (new)
(aa) the funding of illegal content, including advertising-based models;
2021/07/08
Committee: IMCO
Amendment 1564 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects on the exercise of any of the fundamental rights listed in the EU Charter of Fundamental Rights, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1573 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, such as ‘deep fakes’ or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, democratic values, media freedom and freedom of expression of journalists, as well as their ability to verify facts, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/08
Committee: IMCO
Amendment 1584 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how and whether their content moderation systems, recommender systems and systems for selecting and displaying advertisements influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions, as well as the potential infringement of consumer rights by businesses active on the platform or by the platforms themselves.
2021/07/08
Committee: IMCO
Amendment 1593 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. When conducting risk assessments, very large online platforms shall involve representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess.
2021/07/08
Committee: IMCO
Amendment 1601 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1609 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision- making processes, design, the features or functioning of their services, or their terms and conditions;
2021/07/08
Committee: IMCO
Amendment 1614 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display and targeting of advertisements in association with the service they provide;
2021/07/08
Committee: IMCO
Amendment 1627 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. The Board shall evaluate the implementation and effectiveness of the mitigating measures undertaken by very large online platforms listed in Article 27(1) and, where necessary, may issue recommendations.
2021/07/08
Committee: IMCO
Amendment 1630 #
Proposal for a regulation
Article 27 – paragraph 1 b (new)
1b. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation describing the reasons why those measures were not put in place to the Board, in view of issuing specific recommendations, and to independent auditors for the purposes of the audit report. Following the written explanation of the very large online platform's reasons for not putting in place mitigating measures, and where necessary, the Board shall issue specific recommendations as to the mitigation measures that the very large online platform shall implement. Very large online platforms shall, within one month of receiving these recommendations, implement the recommended measures or set out any alternative measures they intend to take to address the identified risks. In case of systemic failure of a very large online platform to take effective mitigating measures and in case of repeated non-compliance with the recommendations, the Board may advise the Commission and the Digital Services Coordinators to impose sanctions.
2021/07/08
Committee: IMCO
Amendment 1631 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
2021/07/08
Committee: IMCO
Amendment 1637 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of each of the systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
2021/07/08
Committee: IMCO
Amendment 1644 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
2021/07/08
Committee: IMCO
Amendment 1659 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 1672 #
Proposal for a regulation
Article 28 – paragraph 2 – point c a (new)
(ca) have been certified by the Commission for the performance of this task;
2021/07/08
Committee: IMCO
Amendment 1676 #
Proposal for a regulation
Article 28 – paragraph 3 – point f a (new)
(fa) a description of specific elements that could not be audited, and an explanation of why these could not be audited;
2021/07/08
Committee: IMCO
Amendment 1678 #
Proposal for a regulation
Article 28 – paragraph 3 – point f b (new)
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
2021/07/08
Committee: IMCO
Amendment 1692 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions and on a designated web page that can be directly reached from the very large online platforms’ online interface, in a clear, accessible and easily comprehensible manner for the general public, the main parameters used in their recommender systems, the optimisation goals of their recommender systems as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1701 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide clear and easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
2021/07/08
Committee: IMCO
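To illustrate the mechanism in Article 29(1) and (2) as amended, a minimal sketch of a per-recipient recommender setting with at least one option not based on profiling, modifiable at any time; the class and option names are hypothetical assumptions, not taken from the proposal.

```python
class RecommenderSettings:
    """Hypothetical per-recipient setting offering at least one option
    not based on profiling, selectable and modifiable at any time."""

    OPTIONS = {
        "personalised": "ranking based on profiling (Art. 4(4) GDPR)",
        "chronological": "ranking not based on profiling",
    }

    def __init__(self, default: str = "chronological"):
        self.select(default)

    def select(self, option: str) -> None:
        """Switch the recommender option; callable at any time."""
        if option not in self.OPTIONS:
            raise ValueError(f"unknown option: {option!r}")
        self.selected = option

settings = RecommenderSettings()
settings.select("personalised")   # the recipient opts in ...
settings.select("chronological")  # ... and can switch back at any time
print(settings.selected)
```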
Amendment 1704 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Online platforms shall ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 1709 #
Proposal for a regulation
Article 30 – title
Additional transparency for online advertisements and ‘deep fakes’ audiovisual media
2021/07/08
Committee: IMCO
Amendment 1716 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make available to relevant authorities and vetted researchers, meeting the requirements of Article 31(4), through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
Amendment 1721 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a) the content of the advertisement, including the name of the product, service or brand and the object of the advertisement;
2021/07/08
Committee: IMCO
Amendment 1724 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(ba) the natural or legal person who paid for the advertisement;
2021/07/08
Committee: IMCO
Amendment 1732 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in each country and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
2021/07/08
Committee: IMCO
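Drawing the repository requirements of the preceding amendments together (Article 30(1) and points (a), (ba) and (e) of paragraph 2), a minimal sketch of an API-facing ad repository that stores only aggregate reach data and purges records one year after last display; all names (AdRecord, AdRepository) are hypothetical, not part of the legal text.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class AdRecord:
    content: str            # para. 2(a): content, product/service/brand name
    payer: str              # para. 2(ba): natural or legal person who paid
    reach_by_country: dict  # para. 2(e): aggregate figures only, no personal data
    last_displayed: date

class AdRepository:
    """Hypothetical API-facing repository retaining each record until one
    year after the advertisement was last displayed (para. 1)."""

    def __init__(self) -> None:
        self._records: list[AdRecord] = []

    def add(self, record: AdRecord) -> None:
        self._records.append(record)

    def purge(self, today: date) -> None:
        """Drop records older than one year after last display."""
        cutoff = today - timedelta(days=365)
        self._records = [r for r in self._records if r.last_displayed >= cutoff]

    def query(self) -> list:
        """What an authority or vetted researcher would retrieve via the API."""
        return list(self._records)

repo = AdRepository()
repo.add(AdRecord("Shoe ad - BrandX", "ExampleCo",
                  {"FR": 120000, "DE": 95000}, date(2021, 5, 1)))
repo.purge(date(2021, 7, 8))
print(len(repo.query()))  # 1 - still within the retention window
```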
Amendment 1739 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The Board shall, after consulting trusted flaggers and vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
2021/07/08
Committee: IMCO
Amendment 1744 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms shall make their best efforts to detect inauthentic videos (‘deep fakes’). When detecting such videos, they should label them as inauthentic in a way that is clearly visible to the internet user.
2021/07/08
Committee: IMCO
Amendment 1746 #
Proposal for a regulation
Article 30 – paragraph 2 c (new)
2c. Very large online platforms selling advertising for display on their online interface shall ensure via standard contractual clauses with the purchasers of advertising space that the content with which the advertisement is associated is compliant with the terms and conditions of the platform, or with the law of the Member States where the recipients of the service to whom the advertisement will be displayed are located.
2021/07/08
Committee: IMCO
Amendment 1747 #
Proposal for a regulation
Article 30 – paragraph 2 d (new)
2d. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, and upon request of advertisers, independent audits performed by organisations complying with the criteria set out in Article 28(2). Such audits shall be based on fair and proportionate conditions agreed between platforms and advertisers, shall be conducted with a reasonable frequency and shall entail:
(a) conducting quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with platforms’ terms and conditions;
(b) monitoring for and detecting fraudulent use of their services to fund illegal activities;
(c) assessing the performance of their tools in terms of brand safety.
The audit report shall include an opinion on the performance of the platforms’ tools in terms of brand safety. Where the audit opinion is not positive, the report shall make operational recommendations to the platforms on specific measures in order to achieve compliance. The platforms shall make available to advertisers, upon request, the results of such audits.
2021/07/08
Committee: IMCO
Amendment 1750 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and without delay, full access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. With regard to moderation and recommender systems, very large online platforms shall provide upon request the Digital Services Coordinator or the Commission with access to algorithms and associated data that allow the detection of possible biases which could lead to the dissemination of illegal content, or content that is in breach of their terms and conditions, or presents threats to fundamental rights, including freedom of expression. Where a bias is detected, very large online platforms shall expeditiously correct it following the recommendations of the Digital Services Coordinator or the Commission. Very large online platforms should be able to demonstrate their compliance at every step of the process pursuant to this Article.
2021/07/08
Committee: IMCO
Amendment 1757 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment, three Digital Services Coordinators of destination or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements of paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, understanding and mitigation of systemic risks as set out in Articles 26(1) and 27.
2021/07/08
Committee: IMCO
Amendment 1765 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, scientific researchers shall be affiliated with academic institutions, be independent from commercial interests and from the very large online platform from which they seek data, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/08
Committee: IMCO
Amendment 1775 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for the following reasons:
(a) in the case of a request under paragraph 1, the very large online platform does not have and cannot obtain with reasonable effort access to the data;
(b) in the case of a request under paragraph 2, the very large online platform does not have access to the data, or providing access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.
2021/07/08
Committee: IMCO
Amendment 1788 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7a. Upon completion of the research envisaged in Article 31(2), the vetted researchers shall make their research publicly available, taking into account the rights and interests of the recipients of the service concerned in compliance with Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1789 #
Proposal for a regulation
Article 31 – paragraph 7 b (new)
7b. Digital Services Coordinators and the Commission shall, once a year, report the following information:
(a) the number of requests made to them as referred to in paragraphs 1 and 2;
(b) the number of such requests that have been declined by the Digital Services Coordinator or the Commission and the reasons for which they have been declined;
(c) the number of such requests that have been declined by the Digital Services Coordinator or the Commission, including the reasons for which they have been declined, following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1 and 2.
2021/07/08
Committee: IMCO
Amendment 1802 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
2a. The reports shall include information on content moderation broken down per Member State in which the services are offered and in the Union as a whole, and shall be published in the official languages of the Member States of the Union.
2021/07/08
Committee: IMCO
Amendment 1858 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/08
Committee: IMCO
Amendment 1867 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain a set of harmonised key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and to their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain, in order to facilitate effective cross-platform monitoring.
2021/07/08
Committee: IMCO
Amendment 1870 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, shall regularly monitor and evaluate the achievement of their objectives, and shall publish their conclusions. Furthermore, they shall ensure that there is a common alert mechanism managed at Union level to allow for real-time and coordinated responses.
2021/07/08
Committee: IMCO
Amendment 1873 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic and repetitive failure to comply with the Codes of Conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the Codes of Conduct.
2021/07/08
Committee: IMCO
Amendment 1881 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service, and civil society organisations or relevant authorities, to contribute to further transparency for all players in the online advertising value chain beyond the requirements of Articles 24 and 30.
2021/07/08
Committee: IMCO
Amendment 1888 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
(ba) the setting-up of a unique identifier that will enable advertisers and publishers to identify and track a campaign throughout its lifecycle.
2021/07/08
Committee: IMCO
Amendment 1890 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes two years after the application of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1891 #
Proposal for a regulation
Article 36 – paragraph 3 a (new)
3a. The Commission shall encourage all the players in the online advertising value chain to endorse and comply with the commitments stated in the codes of conduct.
2021/07/08
Committee: IMCO
Amendment 1917 #
Proposal for a regulation
Article 38 – paragraph 4 a (new)
4a. Member States shall ensure that the competent authorities have adequate financial and human resources, as well as legal and technical expertise to fulfil their tasks under this Regulation.
2021/07/08
Committee: IMCO
Amendment 1928 #
Proposal for a regulation
Article 40 – paragraph 1 a (new)
1a. By way of derogation from paragraph 1, the Member State in which the consumers have their residence shall have jurisdiction for the purposes of Articles 22, 22a and 22b of this Regulation, and the Member State in which the authority issuing the order is situated shall have jurisdiction for the purposes of Articles 8 and 9 of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1939 #
Proposal for a regulation
Article 40 – paragraph 4
4. Paragraphs 1, 1a, 2 and 3 are without prejudice to Article 43(2), the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3.
2021/07/08
Committee: IMCO
Amendment 1954 #
Proposal for a regulation
Article 41 – paragraph 3 a (new)
3a. Following a request to the Commission, and in cases of infringements that persist, could cause serious harm to recipients of the service or could seriously affect their fundamental rights, the Digital Services Coordinator of the country of destination may be entitled to additional powers in the framework of joint investigations as referred to in Article 46.
2021/07/08
Committee: IMCO
Amendment 1972 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Where the complaint concerns alleged harm to the recipients of the service, the Member State where the recipient resides shall have jurisdiction for the purposes of the complaint.
2021/07/08
Committee: IMCO
Amendment 1974 #
Proposal for a regulation
Article 43 a (new)
Article 43a Rights to effective judicial remedies 1. Without prejudice to any available administrative or non-judicial remedy, any recipient of the service or representative organisation shall have the right to an effective judicial remedy where they have suffered harm as a result of an infringement of Articles 26(1) and 27(1). 2. In determining whether the very large online platform has complied with its obligations under Article 27(1), and in light of the principle of proportionality, the availability of suitable and effective measures shall be taken into account. 3. Such proceedings may be brought before the courts of the Member State where the recipient of the service has his or her habitual residence. 4. Without prejudice to any other administrative or non-judicial remedy, any recipient of the service or representative organisation shall have the right to an effective judicial remedy where the Digital Services Coordinator which is competent pursuant to Articles 40 and 43 does not handle a complaint or does not inform the recipient of the service within three months on the progress or outcome of the complaint lodged pursuant to Article 43. Proceedings against a Digital Services Coordinator under paragraph 4 shall be brought before the courts of the Member State where the Digital Services Coordinator is established.
2021/07/08
Committee: IMCO
Amendment 1986 #
Proposal for a regulation
Article 45 – paragraph 1 – subparagraph 2
Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation.
2021/07/08
Committee: IMCO
Amendment 1989 #
Proposal for a regulation
Article 45 – paragraph 2 – introductory part
2. A request or recommendation pursuant to paragraph 1 shall at least indicate:
2021/07/08
Committee: IMCO
Amendment 1996 #
Proposal for a regulation
Article 45 – paragraph 3
3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1 and assess the matter in view of taking specific investigatory or enforcement measures to ensure compliance without undue delay. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided.
2021/07/08
Committee: IMCO
Amendment 1998 #
Proposal for a regulation
Article 45 – paragraph 4
4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto, as well as a statement of reasons in case of a decision, following its investigation, not to take measures to ensure compliance with this Regulation.
2021/07/08
Committee: IMCO
Amendment 2002 #
Proposal for a regulation
Article 45 – paragraph 5
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission and the Digital Services Coordinators, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
2021/07/08
Committee: IMCO
Amendment 2006 #
Proposal for a regulation
Article 45 – paragraph 6
6. The Commission, in cooperation with the Digital Services Coordinators, shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board.
2021/07/08
Committee: IMCO
Amendment 2010 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission, in cooperation with the Digital Services Coordinators, concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. Where the Digital Services Coordinator of establishment fails to comply with the request to take the necessary measures before the end of that two-month period, the Commission shall reallocate the case without delay to the Digital Services Coordinator initiating the request.
2021/07/08
Committee: IMCO
Amendment 2014 #
Proposal for a regulation
Article 46 – title
Joint investigations, cooperation among Digital Services Coordinators and requests for Commission intervention
2021/07/08
Committee: IMCO
Amendment 2020 #
Proposal for a regulation
Article 46 – paragraph 1 a (new)
1a. Where the Digital Services Coordinator of the country of destination considers that an alleged infringement exists and causes serious harm to a large number of recipients of the service in that Member State, or could seriously affect their fundamental rights, it may request the Commission to set up a joint investigation between the Digital Services Coordinator of the country of establishment and the requesting Digital Services Coordinator of the country of destination.
2021/07/08
Committee: IMCO
Amendment 2021 #
Proposal for a regulation
Article 46 – paragraph 1 b (new)
1b. The Commission, in cooperation with the Digital Services Coordinators, shall assess such a request and, following a positive opinion of the Board, shall set up a joint investigation in which the Digital Services Coordinator of the country of destination may be entitled to exercise the following additional powers with respect to the provider of intermediary services concerned by the alleged infringement: (a) to obtain access to the confidential version of the reports published by the intermediary service providers referred to in Article 13 and, where applicable, in Articles 23 and 24, as well as to the annual reports drawn up by the other competent authorities pursuant to Article 44; (b) to obtain access to data collected by the Digital Services Coordinator of the country of establishment for the purpose of supervision of that provider on the territory of the Digital Services Coordinator of the country of destination; (c) to initiate proceedings and assess the matter in view of taking specific investigatory or enforcement measures to ensure compliance, where the suspected seriousness of the infringement would require an immediate response that would not allow for the provisions of Article 45 to apply; and (d) to request interim measures, as referred to in Article 41(2)(e).
2021/07/08
Committee: IMCO
Amendment 2022 #
Proposal for a regulation
Article 46 – paragraph 1 c (new)
1c. The Commission decision setting up the joint investigation shall define a deadline by which the Digital Services Coordinator of the country of establishment and the Digital Services Coordinator launching the request pursuant to paragraph 2 shall agree on a common position on the joint investigation and, where applicable, on the enforcement measures to be adopted. If no agreement is reached within that deadline, the case shall be referred to the Commission pursuant to Article 45(5).
2021/07/08
Committee: IMCO
Amendment 2082 #
Proposal for a regulation
Article 49 – paragraph 1 – point c a (new)
(ca) issue specific recommendations for the implementation of Article 27 and advise on the possible application of sanctions in case of repeated non-compliance;
2021/07/08
Committee: IMCO
Amendment 2100 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission, acting on its own initiative, or the Board, acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
2021/07/08
Committee: IMCO
Amendment 2119 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board's recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2131 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
When the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO