
Activities of Evelyne GEBHARDT related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (170)

Amendment 96 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the freedom to conduct a business, privacy and personal data protection, the right to non-discrimination and access to justice.
2021/07/20
Committee: JURI
Amendment 112 #
Proposal for a regulation
Recital 10
(10) For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council30 and Regulation (EU) 2019/1150 of the European Parliament and of the Council31, Directive 2002/58/EC of the European Parliament and of the Council32 and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC33 as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council34, Directive 2011/83/EU of the European Parliament and of the Council35 and Directive 93/13/EEC of the European Parliament and of the Council36, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council37, Directive 2013/11/EC of the European Parliament and of the Council, Directive 2006/123/EC of the European Parliament and of the Council, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council38. The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions. _________________ 30Regulation (EU) 2019/1148 of the European Parliament and of the Council on the marketing and use of explosives precursors, amending Regulation (EC) No 1907/2006 and repealing Regulation (EU) No 98/2013 (OJ L 186, 11.7.2019, p. 1). 31Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
32Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, 31.7.2002, p. 37. 33Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC. 34Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’). 35Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council. 36Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts. 37Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules. 38Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
2021/07/20
Committee: JURI
Amendment 124 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and cover illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/20
Committee: JURI
Amendment 149 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the Charter of Fundamental Rights of the European Union, including the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/07/20
Committee: JURI
Amendment 158 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/20
Committee: JURI
Amendment 168 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content or as an obligation to use automated content-filtering tools.
2021/07/20
Committee: JURI
Amendment 178 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. In this context and to maintain proportionality, orders addressed to a provider that has its main establishment or legal representation in another Member State or outside the Union should be limited to the Member State issuing the order, unless the legal basis for the order is directly applicable Union law.
2021/07/20
Committee: JURI
Amendment 188 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/07/20
Committee: JURI
Amendment 190 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are fair, non-discriminatory and transparent, and are drafted in clear and unambiguous language in line with applicable Union law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making, human review, the legal consequences to be faced by the users for knowingly storing or uploading illegal content as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available.
2021/07/20
Committee: JURI
Amendment 201 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/20
Committee: JURI
Amendment 211 #
Proposal for a regulation
Recital 9
(9) This Regulation fully harmonises the rules applicable to intermediary services when dealing with illegal content online in the internal market to ensure a safe, predictable and trusted online environment where fundamental rights enshrined in the Charter are effectively protected, in order to improve the functioning of the internal market. Accordingly, Member States should not adopt or maintain additional national requirements on those matters falling within the scope of this Regulation, unless this would affect the direct and uniform application of the fully harmonised rules applicable to the providers of intermediary services which are necessary to ensure the proper functioning of the internal market. The Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. __________________ 28Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/08
Committee: IMCO
Amendment 212 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation should apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level, and should be without prejudice to the Member States’ possibility to adopt and further develop laws, regulations and other measures which serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media or opinion and cultural or linguistic diversity. In particular, in the event of a conflict between Directive 2010/13/EU and the present Regulation, the provisions of Directive 2010/13/EU should prevail. Similarly, legislation that is in accordance with Directive 2010/13/EU at national level, aiming at securing and fostering the fulfilment of various objectives underpinning the audiovisual policy of the Union and its Member States, should also prevail.
__________________ 28Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/08
Committee: IMCO
Amendment 212 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/20
Committee: JURI
Amendment 217 #
Proposal for a regulation
Recital 9 a (new)
(9a) The right of the Member States to provide additional obligations, exemptions or derogations, which serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media or opinion and cultural or linguistic diversity, should remain unaffected. Because of the convergence of media, legislation and other measures that ensure and promote media pluralism may be necessary for the entire online environment. The right of the Member States especially includes substantive rules, rules of procedure and enforcement rules, including the regulatory structure.
2021/07/08
Committee: IMCO
Amendment 218 #
Proposal for a regulation
Recital 9 b (new)
(9b) Respecting the Union’s subsidiary competence to take cultural aspects into account in its action according to Article 167(4) of the Treaty on the Functioning of the European Union, this Regulation should not affect Member States’ competences in their respective cultural policies, nor should it prejudice national measures addressed to intermediary service providers in order to protect the freedom of expression and information, media freedom and to foster media pluralism as well as cultural and linguistic diversity.
2021/07/08
Committee: IMCO
Amendment 221 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. Dispute resolution proceedings should be concluded within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/20
Committee: JURI
Amendment 227 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content in a designated area of expertise, that they represent collective, non-commercial interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/20
Committee: JURI
Amendment 233 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is in conformity with Union law and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 246 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end. Given the significant risks that arise from targeted advertising, including the amplification of illegal or harmful content and other risks associated with the reliance on pervasive tracking and data mining, targeting of advertising based on personal data should be prohibited. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising.
In this context, it is important to highlight that consent to targeted advertising should not be considered as freely given, specific and thus valid if access to the service is made conditional on processing of personal data and profiling techniques outside of the control of the user. This Regulation is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/20
Committee: JURI
Amendment 274 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including at least one default option that is not based on profiling of the recipient and alternative, third-party recommender systems where technically possible.
2021/07/19
Committee: JURI
Amendment 290 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Recipients of a service are often locked in to existing platforms due to network effects, which significantly limits user choice. In order to facilitate free choice of recipients between different services, it is therefore important to consider interoperability for industry- standard features of very large online platforms, such as core messaging functionality or image-sharing services. Such interoperability would empower recipients to choose a service based on its functionality and features such as security, privacy, and data processing standards, rather than its existing user base.
2021/07/19
Committee: JURI
Amendment 292 #
Proposal for a regulation
Recital 66
(66) To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or about the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate.
2021/07/19
Committee: JURI
Amendment 320 #
Proposal for a regulation
Recital 28 a (new)
(28a) Since media service providers hold editorial responsibility for the content and services they make available, such content and services should benefit from a specific regime that prevents multiple control of that content and those services. Such content and services are typically offered in accordance with professional and journalistic standards as well as legislation and are already subject to systems of supervision and control, often enshrined in commonly accepted self-regulatory standards and codes. In addition, media service providers usually have in place complaints-handling mechanisms to resolve content-related disputes. Editorial responsibility means the exercise of effective control both over the selection of content and over its provision by means of its presentation, composition and organisation. Editorial responsibility does not necessarily imply any legal liability under national law for the content or the services provided. Intermediary service providers should refrain from removing, suspending or disabling access to any such content or services. Intermediary service providers should be exempt from liability for content and services offered by media service providers. A presumption of legality should exist in relation to the content and services provided by media service providers who carry out their activities in respect of European values and fundamental rights. Compliance by media service providers with these rules and regulations should be overseen by the respective independent regulatory authorities, bodies or both, and the respective European networks in which they are organised.
2021/07/08
Committee: IMCO
Amendment 341 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
2021/07/08
Committee: IMCO
Amendment 360 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
2021/07/08
Committee: IMCO
Amendment 365 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(i a) Directive 2006/123/EC
2021/07/19
Committee: JURI
Amendment 375 #
Proposal for a regulation
Recital 39 a (new)
(39a) In order to effectively and meaningfully address the proliferation of illegal goods and services online, intermediary services should implement measures to prevent illicit content from reappearing after having been taken down. Such measures, undertaken horizontally by all intermediary services, will contribute to a safer online environment.
2021/07/08
Committee: IMCO
Amendment 383 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/19
Committee: JURI
Amendment 402 #
Proposal for a regulation
Article 2 – paragraph 1 – point q
(q) ‘terms and conditions’ means all terms and conditions or specifications provided by the provider of intermediary services, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.
2021/07/19
Committee: JURI
Amendment 405 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘manifestly illegal content’ means any information which is unmistakably, and without requiring in-depth examination, in breach of legal provisions regulating the legality of content online.
2021/07/19
Committee: JURI
Amendment 435 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or temporarily disabling access to, manifestly illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
2021/07/19
Committee: JURI
Amendment 436 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall notify the competent judicial or administrative authority and the recipient of the service concerned about detection and/or disabling of access to manifestly illegal content without undue delay; notified authorities shall authorise the permanent removal of the content notified;
2021/07/19
Committee: JURI
Amendment 437 #
Proposal for a regulation
Article 6 – paragraph 1 b (new)
Voluntary own-initiative investigations shall not lead to ex-ante control measures based on automated content moderation tools.
2021/07/19
Committee: JURI
Amendment 438 #
Proposal for a regulation
Article 6 – paragraph 1 c (new)
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as human oversight, documentation, traceability, transparency of algorithms used or additional measures to ensure the accuracy, fairness, transparency and non-discrimination of voluntary own-initiative investigations.
2021/07/19
Committee: JURI
Amendment 441 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. Providers of intermediary services shall not be obliged to use automated tools for content moderation.
2021/07/19
Committee: JURI
Amendment 447 #
Proposal for a regulation
Recital 50 a (new)
(50a) After having obtained the necessary contact information of a trader, which is aimed at ensuring consumer rights, a provider of intermediary services needs to verify that these details are consistently kept up to date and accessible for consumers. Therefore, it shall conduct regular and randomized checks on the information provided by the traders on its platform. To ensure a consistent display of this contact information, intermediary services should establish mandatory designs for its inclusion. A content, good or service shall only be displayed after all necessary information is made available by the business user.
2021/07/08
Committee: IMCO
Amendment 452 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed, with due regard to fundamental rights of the recipient of the service concerned;
2021/07/19
Committee: JURI
Amendment 454 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
- identification of the competent judicial or administrative authority;
2021/07/19
Committee: JURI
Amendment 455 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 b (new)
- reference to the legal basis for the order;
2021/07/19
Committee: JURI
Amendment 457 #
(b) the territorial scope of the order, addressed to a provider that has its main establishment in the Member State issuing the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective;
2021/07/19
Committee: JURI
Amendment 460 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) the territorial scope of an order addressed to a provider that has its main establishment or legal representation in another Member State or outside the Union, is limited to the territory of the Member State issuing the order, unless the legal basis for the order is directly applicable Union law;
2021/07/19
Committee: JURI
Amendment 461 #
Proposal for a regulation
Recital 52 a (new)
(52a) The market position of very large online platforms allows them to collect and combine enormous amounts of personal data, thereby strengthening their market position vis-a-vis smaller competitors, while at the same time incentivising other online platforms to take part in comparable data collection practices and thus creating an unfavourable environment for consumers. Therefore, the collecting and further processing of personal data for the purpose of displaying tailored advertisement should be prohibited. The selection of advertisements shown to a consumer should consequently be based on contextual information, such as language settings by the device of the user or the digital location. Besides a positive effect on privacy and data protection rights of users, the ban will increase competition on the market and will facilitate market access for smaller online platforms and privacy-friendly business models.
2021/07/08
Committee: IMCO
Amendment 465 #
Proposal for a regulation
Recital 52 b (new)
(52b) The ban on targeted advertising should not hinder contextual advertisement, such as the displaying of a car advertisement on a website presenting information from the automotive sector.
2021/07/08
Committee: IMCO
Amendment 488 #
Proposal for a regulation
Recital 58 a (new)
(58a) Mitigation of risks, which would lead to removal, disabling access to or otherwise interfering with media services and content for which a media service provider holds editorial responsibility, should not be considered reasonable or proportionate.
2021/07/08
Committee: IMCO
Amendment 496 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, with due regard to fundamental rights of the recipient of the service concerned, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/07/19
Committee: JURI
Amendment 499 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
- identification of the competent judicial or administrative authority;
2021/07/19
Committee: JURI
Amendment 501 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 b (new)
- reference to the legal basis for the order;
2021/07/19
Committee: JURI
Amendment 508 #
Proposal for a regulation
Recital 65 a (new)
(65a) Due to their market position, very large online platforms have developed an increasing influence over society’s social, economic, and political interactions. Consumers face a lock-in situation, which may lead them into accepting unfavourable terms and conditions to participate in the services provided by these very large online platforms. To restore a competitive market and to allow consumers more choices, very large online platforms should be required to set up the necessary technical access points to create interoperability for their core services, with a view to allowing competitors a fairer market access and enabling more choice for consumers, while at the same time complying with privacy, security and safety standards. These access points should create interoperability for other online platform services of the same type, without the need to convert digital content or services to ensure functionality.
2021/07/08
Committee: IMCO
Amendment 513 #
Proposal for a regulation
Article 9 – paragraph 4 a (new)
4 a. The obligations under this Article shall not oblige providers of intermediary services to introduce new tracking or profiling techniques for recipients of the service in order to comply with orders to provide information.
2021/07/19
Committee: JURI
Amendment 530 #
Proposal for a regulation
Recital 70 a (new)
(70a) The Commission should encourage the development of codes of conduct to facilitate online platforms’ verification of short-term holiday rental providers’ compliance with national registration and authorisation schemes. Such codes of conduct should aim in particular at establishing effective cooperation mechanisms between online platforms and public authorities on short term holiday rentals.
2021/07/08
Committee: IMCO
Amendment 534 #
Proposal for a regulation
Recital 73
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure regular reporting and effective involvement of all relevant authorities in the supervision and enforcement at Union level.
2021/07/08
Committee: IMCO
Amendment 534 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format and include a searchable archive of previous versions of the provider’s terms and conditions.
2021/07/19
Committee: JURI
Amendment 536 #
Proposal for a regulation
Recital 73 a (new)
(73a) The designation of a Digital Services Coordinator in the Member State should be without prejudice to already existing enforcement mechanisms, such as in electronic communications or media regulation, and independent regulatory structures in these fields as defined by European and national law. The competences of the Digital Services Coordinator should not interfere with those of the appointed authorities. For ensuring coordination and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union, the different European networks, in particular the European Regulators Group for Audiovisual Media Services (ERGA) and the Body of European Regulators for Electronic Communications (BEREC), should be responsible. For the effective implementation of this task, these networks should develop suitable procedures to be applied in cases concerning this Regulation.
2021/07/08
Committee: IMCO
Amendment 547 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting-out from optional clauses and the remedies available.
2021/07/19
Committee: JURI
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of displaying digital advertising to a specific recipient or group of recipients.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language or the approximate geographical location of the recipient of the service to whom the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if the advertisement is displayed in real time and it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or group of persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or group of persons.
2021/07/19
Committee: JURI
Amendment 584 #
Proposal for a regulation
Article 13 b (new)
Article 13b
Online interface design
1. Providers of intermediary services shall refrain from subverting or impairing autonomous decision-making or free choice of a recipient of a service through the design, functioning or operation of online interfaces or a part thereof, such as but not limited to:
(a) according visual prominence to one option when asking the recipient of the service for consent or a decision;
(b) repeatedly requesting consent to data processing or requesting a change to a setting or configuration of the service after the recipient of the service has already made her choice;
(c) making the procedure of cancelling a service more difficult than signing up to it.
2. A choice or decision by the recipient of the service using an online interface that does not comply with the requirements of this article shall not constitute consent in accordance with Regulation (EU) 2016/679.
3. The Commission shall be empowered to publish guidelines indicating specific design choices that qualify as subverting or impairing the autonomy, decision-making processes or choices of the recipient of the service.
2021/07/19
Committee: JURI
Amendment 588 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user- friendly, clearly visible on the hosting service interface, and allow for the submission of notices exclusively by electronic means and in the language of the individual or entity submitting a notice.
2021/07/19
Committee: JURI
Amendment 589 #
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2021/07/19
Committee: JURI
Amendment 593 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) a sufficiently substantiated explanation of the reasons why the individual or entity considers the information in question to be illegal content;
2021/07/19
Committee: JURI
Amendment 597 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/07/19
Committee: JURI
Amendment 599 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market. In particular, it establishes:
2021/07/08
Committee: IMCO
Amendment 604 #
Proposal for a regulation
Article 1 – paragraph 2 – point a
(a) contribute to the proper functioning of the internal market for intermediary services to ensure fair competition;
2021/07/08
Committee: IMCO
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniformharmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
2021/07/08
Committee: IMCO
Amendment 606 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 on the basis of which a diligent economic operator can identify the illegality of the content in question shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
2021/07/19
Committee: JURI
Amendment 624 #
Proposal for a regulation
Article 1 – paragraph 5 – introductory part
5. This Regulation shall and will not affect the rules laid down by the following:
2021/07/08
Committee: IMCO
Amendment 639 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures for content moderation based on automated tools or ex-ante filtering of content. Where providers of hosting services use automated tools for content moderation, they shall ensure qualified human oversight for any action taken and that legal content which does not infringe the terms and conditions set out by the provider is not affected. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service. Content moderation practices shall be proportionate to the type and volume of content, relevant and limited to what is necessary for the purposes for which the content is moderated.
3. Providers of hosting services shall not subject recipients of the service to discriminatory practices, exploitation or exclusion for the purposes of content moderation, such as removal of user-generated content based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.
2021/07/19
Committee: JURI
Amendment 640 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
2021/07/08
Committee: IMCO
Amendment 642 #
Proposal for a regulation
Article 15 b (new)
Article 15b
Content moderation staff
Providers of hosting services shall ensure adequate qualification of staff working on content moderation, including ongoing training on the applicable legislation and fundamental rights. The provider shall also provide appropriate working conditions, including the opportunity to seek professional support, qualified psychological assistance and qualified legal advice.
2021/07/19
Committee: JURI
Amendment 643 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5a. This Regulation shall not affect the possibility of Member States to adopt new legislation as well as to take regulatory measures, especially with regard to intermediary service providers that serve a legitimate public interest, in particular to protect the freedom of information and media or to foster the diversity of media and opinion or of cultural and linguistic diversity.
2021/07/08
Committee: IMCO
Amendment 644 #
Proposal for a regulation
Article 1 a (new)
Article 1a
Contractual provisions
1. Any contractual provisions between an intermediary service provider and a trader, business user, or a recipient of its service which are contrary to this Regulation shall be unenforceable.
2. This Regulation shall apply irrespective of the law applicable to contracts concluded between providers of intermediary services and a recipient of the service, a consumer, a trader or business user.
2021/07/08
Committee: IMCO
Amendment 645 #
Proposal for a regulation
Article 1 a (new)
Article 1a
No circumvention of the rules set out in this Regulation
1. Any contractual provision between an intermediary service provider and a recipient of its service, between an intermediary service provider and a trader or between a recipient of its service and a trader, which is contrary to this Regulation, is invalid.
2. This Regulation shall apply irrespective of the law applicable to contracts.
2021/07/08
Committee: IMCO
Amendment 646 #
Proposal for a regulation
Article 1 a (new)
Article 1a
Objective
The aim of this Regulation is to contribute to the proper functioning of the internal market by setting out harmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/08
Committee: IMCO
Amendment 656 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions against or in favour of removal or disabling of access to the information;
2021/07/19
Committee: JURI
Amendment 657 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions against or in favour of suspension or termination of the provision of the service, in whole or in part, to the recipients;
2021/07/19
Committee: JURI
Amendment 659 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions against or in favour of suspension or termination of the recipients’ account.
2021/07/19
Committee: JURI
Amendment 663 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions against or in favour of demonetising content provided by the recipients;
2021/07/19
Committee: JURI
Amendment 665 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions against or in favour of applying additional labels or information to content provided by the recipients;
2021/07/19
Committee: JURI
Amendment 668 #
Proposal for a regulation
Article 17 – paragraph 1 – point c c (new)
(cc) decisions that adversely affect the recipient’s access to significant features of the platform’s regular services;
2021/07/19
Committee: JURI
Amendment 669 #
Proposal for a regulation
Article 17 – paragraph 1 – point c d (new)
(cd) decisions not to act upon a notice.
2021/07/19
Committee: JURI
Amendment 677 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5a. Online platforms shall ensure that any relevant information in relation to decisions taken by the internal complaint-handling mechanism is available to recipients of the service for the purpose of seeking redress through an out-of-court dispute settlement body pursuant to Article 18 or before a court.
2021/07/19
Committee: JURI
Amendment 681 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law. Judicial redress against a decision by an out-of-court dispute settlement body shall be directed against the online platform, not the settlement body.
2021/07/19
Committee: JURI
Amendment 682 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, is not in compliance with Union law or with a law of a Member State where it is in conformity with Union law, irrespective of the precise subject matter or nature of that law;
2021/07/08
Committee: IMCO
Amendment 684 #
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body, where the body has demonstrated that it meets all of the following conditions:
2021/07/19
Committee: JURI
Amendment 687 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms, including aspects such as financial resources and personnel;
2021/07/19
Committee: JURI
Amendment 696 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear, fair and publicly available rules of procedure.
2021/07/19
Committee: JURI
Amendment 697 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time.
2021/07/19
Committee: JURI
Amendment 701 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 2
Certified out-of-court dispute settlement bodies shall make information on the fees, or the mechanisms used to determine the fees, publicly available.
2021/07/19
Committee: JURI
Amendment 703 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. Decisions reached by an out-of-court dispute settlement body shall not be disputable by another out-of-court dispute settlement body and the resolution of a particular dispute may only be discussed in one out-of-court dispute settlement body.
2021/07/19
Committee: JURI
Amendment 711 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices on manifestly illegal content submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/07/19
Committee: JURI
Amendment 717 #
Proposal for a regulation
Article 19 – paragraph 2 – point a
(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying manifestly illegal content in a designated area of expertise;
2021/07/19
Committee: JURI
Amendment 722 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective, non- commercial interests and is independent from any online platform;
2021/07/19
Committee: JURI
Amendment 724 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner and in full respect of fundamental rights such as the freedom of expression and information.
2021/07/19
Committee: JURI
Amendment 734 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) “media service provider” means the natural or legal person who has editorial responsibility for the content and services they offer, determines the manner in which it is organised, and complies with specific provisions, or an audiovisual media service provider within the meaning of Article 1 paragraph 1(a) of Directive 2010/13/EU;
2021/07/08
Committee: IMCO
Amendment 736 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inadequately substantiated or incorrect notices, or notices violating recipients’ fundamental rights, through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/07/19
Committee: JURI
Amendment 798 #
Proposal for a regulation
Article 7 a (new)
Article 7a
Prohibition of interference with content and services offered by media service providers and press publishers
1. Intermediary service providers shall not remove, disable access to or otherwise interfere with content and services made available by media service providers, who hold the editorial responsibility and comply with provisions consistent with EU and national law, or by publishers of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790. Publishers’ and media service providers’ accounts shall not be suspended on the grounds of legal content and services they offer.
2. This Article shall not affect the possibility for an independent judicial or administrative authority of requiring the media service provider to terminate or prevent an infringement of applicable Union or national law.
2021/07/08
Committee: IMCO
Amendment 813 #
Proposal for a regulation
Article 23 – paragraph 1 – point a
(a) the number of disputes submitted to the certified out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures;
2021/07/19
Committee: JURI
Amendment 818 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
— precise indication of the credentials of the relevant national judicial or administrative authority issuing the order and details of the person(s) of contact within the said authority;
2021/07/08
Committee: IMCO
Amendment 827 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
2021/07/19
Committee: JURI
Amendment 832 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
2. Online platforms that display advertising on their online interfaces shall include in the reports referred to in Article 13 the following information:
(a) the number of advertisements removed, disabled, or labelled by the online platform, accompanied by a justification explaining the grounds for the decision;
(b) aggregated data on the provider of the online advertisements that were removed, disabled or labelled by the online platform, including information on the advertisement published, the amount paid for the advertisement and information on the target audience, if applicable.
2021/07/19
Committee: JURI
Amendment 846 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of unique monthly active recipients of the service in the Union equal to or higher than 45 million on average, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/07/19
Committee: JURI
Amendment 855 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services and activities, such as business model and design decisions, in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 866 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, including the respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 902 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of all systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
2021/07/19
Committee: JURI
Amendment 910 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. Very large online platforms shall designate a legal representative in each of the Member States where the provider offers its services.
2021/07/08
Committee: IMCO
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service in fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format in a searchable archive of all the previous versions with their date of application.
2021/07/08
Committee: IMCO
Amendment 930 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679. Any option based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 shall never be the default setting of a recommender system.
2021/07/19
Committee: JURI
Amendment 934 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters referred to in paragraph 1 shall include but not be limited to:
(a) the recommendation criteria;
(b) the objectives of the recommender system;
(c) the hierarchy and weighting of the different criteria, if applicable;
(d) the role of recipient behaviour in determining recommender system outputs, if applicable;
2021/07/19
Committee: JURI
Amendment 940 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to national and Union law, the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service, in particular the freedom of expression and information, as enshrined in the Charter.
2021/07/08
Committee: IMCO
Amendment 942 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Very large online platforms that use recommender systems shall allow the recipient of the service to have information presented to them in a chronological order only and, where technically possible, to use third-party recommender systems. Third-party recommender systems shall have access to the same information available to the recommender systems used by the platform, notwithstanding the platform’s obligations under Regulation (EU) 2016/679. Very large online platforms may only temporarily limit access to third- party recommender systems in case of provable abuse by the third-party provider or when justified by an immediate requirement to address a technical issue such as a serious security vulnerability.
2021/07/19
Committee: JURI
Amendment 948 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces an easily accessible and searchable repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/19
Committee: JURI
Amendment 951 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
2021/07/19
Committee: JURI
Amendment 956 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Intermediary service providers shall notify media service providers pursuant to Article 7a beforehand of any proposed changes to their general terms and conditions and to their parameters or algorithms that might affect the organisation, presentation and display of content and services. The proposed changes shall not be implemented before the expiry of a notice period that is reasonable and proportionate to the nature and extent of the proposed changes and their impact on media service providers and their content and services. That period shall begin on the date on which the online intermediary service provider notifies the media service providers of the proposed changes. The provision of new content and services on the intermediary services before the expiry of the notice period by a media service provider shall not be considered a conclusive or affirmative action, given that such content is of particular importance for the exercise of fundamental rights, in particular the freedom of expression and information.
2021/07/08
Committee: IMCO
Amendment 956 #
(ea) any decisions by the online platform regarding labelling, removal or disabling of online advertisements, including a justification explaining the grounds for the decision.
2021/07/19
Committee: JURI
Amendment 972 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate in an easily accessible and user-friendly format. This shall include personal data only where it is lawfully accessible by the public and without prejudice to Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 975 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions or civil society organisations representing the public interest, be independent from commercial interests, disclose the sources of funding financing their research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/19
Committee: JURI
Amendment 977 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/19
Committee: JURI
Amendment 1006 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Interoperability
1. Very large online platforms shall offer interoperability of industry-standard features of their services to other online platforms by creating easily accessible application programming interfaces.
2. Very large online platforms may only temporarily limit access to interoperability features in case of provable abuse by a third-party provider or when justified by an immediate requirement to address a technical issue such as a serious security vulnerability.
3. In accordance with Union legislation on standardisation, the Commission shall request European standardisation bodies to develop the necessary technical standards for interoperability, such as protocol interoperability and data interoperability and portability.
4. The Commission shall be empowered to review the implementation of these obligations by very large online platforms, adopt implementing measures specifying the nature and scope of the obligations, and provide updateable definitions of industry-standard features where necessary.
5. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
Amendment 1011 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. Absence of agreement on voluntary industry standards shall not prevent the applicability or implementation of any measures outlined in this Regulation.
2021/07/19
Committee: JURI
Amendment 1015 #
Proposal for a regulation
Article 13 a (new)
Article 13a
Display of the identity of business users
1. A provider of intermediary services shall ensure that the identity of the business user providing content, goods or services is clearly visible alongside the content, goods or services offered.
2. For this purpose, a provider of intermediary services shall establish a standardised and mandatory interface for business users. Content, goods or services shall only be displayed to users if the necessary contact information is made available.
3. A provider of intermediary services shall conduct regular checks on the information provided by a business user in accordance with paragraph 2.
2021/07/08
Committee: IMCO
Amendment 1017 #
Proposal for a regulation
Article 13 a (new)
Article 13a Display of the identity of traders Intermediary service providers shall ensure that the identity, such as the trademark or logo or other characteristic traits, of the provider providing content, goods or services on the intermediary services is clearly visible alongside the content, goods or services offered.
2021/07/08
Committee: IMCO
Amendment 1019 #
Proposal for a regulation
Article 13 b (new)
Article 13b Targeted advertising Providers of intermediary services shall not collect or use personal data of a service recipient for the purpose of targeting or tailoring digital advertising. If a service provider legitimately receives information that allows it to make assumptions about the physical, physiological, genetic, mental, economic, cultural or social identity of a user, this information shall not be used for advertising purposes, specifically not for targeting or tailoring of advertising.
2021/07/08
Committee: IMCO
Amendment 1029 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, for example through online web forms.
2021/07/08
Committee: IMCO
Amendment 1051 #
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law or the allocation of additional powers under other applicable law.
2021/07/19
Committee: JURI
Amendment 1059 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. A provider shall remain exempt from liability, despite such knowledge, for a period that is appropriate to take an informed decision on the matter.
2021/07/08
Committee: IMCO
Amendment 1069 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Reporting persons within the meaning of Article 4 of Directive (EU) 2019/1937 shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the reporting person resides. Such complaints shall be treated with priority by the Digital Services Coordinator and shall, where appropriate, be transmitted to the Digital Services Coordinator of the establishment of the provider of the intermediary service concerned.
2021/07/19
Committee: JURI
Amendment 1078 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where an online platform that allows consumers to conclude distance contracts with traders detects and identifies illegal goods or services, it shall be obliged to establish an internal database of those goods and services that had previously been taken down by the online platform because they had been found to be illegal or harmful. Drawing on the elements listed in the Rapid Exchange of Information System (RAPEX) and other relevant public databases, platforms shall scan their database on a daily basis to detect illegal goods and services. If this process detects a good or service that has previously been found to be illegal or harmful, the online platform shall be obliged to delete the content expeditiously.
2021/07/08
Committee: IMCO
Amendment 1079 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where the explanation of the reasons as referred to in paragraph 2(a) does not allow a diligent economic operator to identify the illegality of the content in question, where the notified content is not illegal in the country of establishment of the hosting service, or where there is a genuine demonstrable doubt about the illegality of the content, the hosting service provider may seek assistance for further clarification from the relevant authority or the national Digital Services Coordinator.
2021/07/08
Committee: IMCO
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Online interface design and organisation
1. Providers of hosting services shall not distort or impair consumers’ ability to make an informed decision via the structure, function or manner of operation of their online interface or a part thereof.
2. Providers of hosting services shall design and organise their online interface in a way that enables themselves and traders to comply with their obligations under applicable Union and Member State law on consumer protection, including on product safety.
2021/07/08
Committee: IMCO
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge against the decision taken by the provider of the online platform not to act upon receipt of a notice, or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/07/08
Committee: IMCO
Amendment 1152 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
2021/07/08
Committee: IMCO
Amendment 1159 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
2021/07/08
Committee: IMCO
Amendment 1163 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
2021/07/08
Committee: IMCO
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
2021/07/08
Committee: IMCO
Amendment 1205 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
Paragraph 1 is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law.
2021/07/08
Committee: IMCO
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/08
Committee: IMCO
Amendment 1243 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
2021/07/08
Committee: IMCO
Amendment 1263 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, having regard to their expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/07/08
Committee: IMCO
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
2021/07/08
Committee: IMCO
Amendment 1398 #
Proposal for a regulation
Article 22 – paragraph 1 – point f a (new)
(fa) whether the drop shipping principle is applied, i.e. goods are offered that are not in stock in the retailer's warehouse;
2021/07/08
Committee: IMCO
Amendment 1411 #
Proposal for a regulation
Article 22 – paragraph 2 a (new)
2a. Before giving traders access to offer products or services and to display advertising on their online interfaces, the online platform shall make reasonable efforts to prevent fraudulent practices on their platform, such as offers or advertisements of fake shop operators.
2021/07/08
Committee: IMCO
Amendment 1442 #
Proposal for a regulation
Article 22 – paragraph 6
6. The online platform shall make the information referred to in points (a), (d), (e), (f) and (g) of paragraph 1 available to the recipients of the service, easily accessible in accordance with Directive (EU) 2019/882, in a clear and comprehensible manner.
2021/07/08
Committee: IMCO
Amendment 1447 #
Proposal for a regulation
Article 22 – paragraph 6 a (new)
6a. In order to comply with point (g) of paragraph 1, web shops shall indicate, close to the depicted goods, whether those goods are in stock or whether a manufacturer first has to be found for them. Online marketplaces shall provide third-party sellers with a dropshipping labelling tool, which they must use in order to be approved by the platform.
2021/07/08
Committee: IMCO
Amendment 1452 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
7a. The online platform may rely on the information provided by third-party suppliers referred to in point (b) of Article 6a of Directive (EU) 2019/2161, unless the platform knows or ought to know, based on the available data regarding transactions on the platform, that this information is incorrect. Online platforms must take adequate measures to prevent traders from appearing on the platform as non-traders.
2021/07/08
Committee: IMCO
Amendment 1454 #
Proposal for a regulation
Article 22 – paragraph 7 b (new)
7b. An online platform is liable for damages caused to consumers by a violation of its duties under this Article.
2021/07/08
Committee: IMCO
Amendment 1455 #
Proposal for a regulation
Article 22 – paragraph 7 c (new)
7c. The online platform must inform the consumer, at the earliest possible point in time and immediately before the distance contract is concluded with a third-party provider, in a prominent manner that the consumer is concluding a contract with the third party and not with the online platform. If the online platform violates its duty to provide this information, the consumer may also assert against the online platform the rights and legal remedies for non-performance arising from the distance contract with the third party.
2021/07/08
Committee: IMCO
Amendment 1456 #
Proposal for a regulation
Article 22 – paragraph 7 d (new)
7d. If an online platform provides misleading information about third-party providers, about goods, services or digital content offered by third-party providers or about other provisions of the distance contract, the online platform is liable for the damage that this misleading information inflicts on consumers.
2021/07/08
Committee: IMCO
Amendment 1457 #
Proposal for a regulation
Article 22 – paragraph 7 e (new)
7e. An online platform is liable for guarantees which it gives about a third-party supplier or about goods, services or digital content offered by a third-party supplier.
2021/07/08
Committee: IMCO
Amendment 1463 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Duty to protect recipients of the service
Operators of online platforms allowing consumers to conclude distance contracts with traders or consumers, or of very large online platforms according to Article 25, who fail to take adequate measures for the protection of the recipients of the service upon obtaining credible evidence of criminal conduct of a recipient of the service to the detriment of other recipients, or evidence of the illegality of a certain product, service, commercial practice or advertising method of a third-party supplier, shall be held liable for the damages resulting from such a failure.
2021/07/08
Committee: IMCO
Amendment 1507 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms or advertising service providers that display advertisements shall also check the accuracy of the information about the advertiser in accordance with the due diligence obligations pursuant to Article 22. Where there are indications of dubious offers, such as obvious irregularities, user reports or web shops “blacklisted” on warning lists, platforms or the advertising service providers behind them may not display the advertising.
2021/07/08
Committee: IMCO
Amendment 1519 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Prevention measures against online fraud on platforms
Member States shall promote preventive measures to reduce consumer harm caused by illegal advertising and sales practices on platforms. This includes, among other things, the establishment of information platforms that publish daily warnings about current online traps. Such initiatives shall be linked Union-wide via a network, financed by the Commission and supported by an EU coordinator. Host providers shall provide clearly visible links to these prevention pages.
2021/07/08
Committee: IMCO
Amendment 1520 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Right to information
1. Where an online platform becomes aware, irrespective of how it obtained that awareness, of the illegal nature of a product or service offered through its services, it shall inform those recipients of the service that had acquired such a product or contracted such a service during the last six months about the illegality, the identity of the trader and any means of redress.
2021/07/08
Committee: IMCO
Amendment 1547 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union and shall submit a report of that risk assessment to the national competent authority of the Member State in which their legal representative is established. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1562 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for human dignity, private and family life, freedom of expression and information including the freedom and pluralism of the media, freedom of the arts and sciences and the right to education, the prohibition of discrimination and the rights of the child, as enshrined in Articles 1, 7, 11, 13, 14, 21 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1570 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1697 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters used in recommender systems shall always be fair and non-discriminatory.
2021/07/08
Committee: IMCO
Amendment 1708 #
Proposal for a regulation
Article 29 a (new)
Article 29a
Recommendation systems and individual or target-group-specific pricing on online marketplaces
The description shall also include information on whether users are shown different prices depending on individual factors, as defined in Article 6(1)(ea) of Directive 2011/83/EU, or target-group-specific factors, in particular the devices used and geographical locations. Where applicable, the platform shall make reference to these factors in a clearly visible manner.
2021/07/08
Committee: IMCO
Amendment 1737 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The archive must be easily accessible for users and contain a complaint and reporting option for users, addressed directly to the platform and the responsible advertising service provider. The requirements for notifications under Article 14 also apply to notifications and complaints about advertising content.
2021/07/08
Committee: IMCO
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
2021/07/08
Committee: IMCO
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/08
Committee: IMCO
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Algorithm accountability
1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used.
2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements:
(a) the compliance with corresponding Union requirements;
(b) how the algorithm is used and its impact on the provision of the service;
(c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and
(d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c).
3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations.
4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, where the additional information provided on the findings is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission.
5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure that the algorithm complies with the criteria set out in paragraph 2.
6. Where the Commission finds, on the basis of the information provided by the very large online platform, that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, and that the very large online platform has not undertaken corrective measures as referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
2021/07/08
Committee: IMCO
Amendment 1809 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Interoperability
1. Very large online platforms shall provide, by creating and offering an application programming interface, options enabling the interoperability of their core services to other online platforms.
2. Application programming interfaces should be easy to use, while the processing of personal data shall only be possible in a manner that ensures appropriate security of those data. Measures under paragraph 1 may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue delay in the provision of interoperability.
3. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1811 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies and, whenever available, widely used information and communication technology standards that meet the requirements set out in Annex II to Regulation (EU) No 1025/2012, at least for the following:
2021/07/08
Committee: IMCO
Amendment 1822 #
Proposal for a regulation
Article 34 – paragraph 1 – point e
(e) interoperability of the advertisement repositories referred to in Article 30(2), and the APIs referred to in Article 33a;
2021/07/08
Committee: IMCO
Amendment 1842 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. The absence of such standards as defined in this Article should not prevent the timely implementation of the measures outlined in this Regulation.
2021/07/08
Committee: IMCO
Amendment 1894 #
Proposal for a regulation
Article 36 a (new)
Article 36a
Codes of conduct for short-term holiday rentals
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms, short-term holiday rental providers and relevant authorities to contribute to the proper enforcement of the authorisation and registration schemes for short-term holiday rentals.
2. The Commission shall aim to ensure that the codes of conduct lead to the development of effective mechanisms for online platforms to verify and track short-term holiday rental providers' compliance with national registration and authorisation requirements. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/08
Committee: IMCO
Amendment 1914 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2
Member States shall make publicly available through online and offline means, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted.
2021/07/08
Committee: IMCO
Amendment 1919 #
Proposal for a regulation
Article 38 a (new)
Article 38a
Relation to sector-specific provisions
The application of these provisions does not affect areas that are subject to sector-specific regulation and provisions. In those areas, the responsibility for enforcing the provisions lies with the competent national authorities, which are organised in European networks. Within these networks, the competent authorities shall establish suitable procedures that allow for effective coordination and the consistent application and enforcement of this Regulation.
2021/07/08
Committee: IMCO
Amendment 1941 #
Proposal for a regulation
Article 41 – paragraph 1 – introductory part
1. Where needed for carrying out their tasks under this Regulation, and in order to avoid discrepancies in the enforcement of the Digital Services Act, Digital Services Coordinators shall have at least the following powers of investigation in respect of conduct by providers of intermediary services under the jurisdiction of their Member State:
2021/07/08
Committee: IMCO
Amendment 2050 #
Proposal for a regulation
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator, notably representatives of European regulatory networks of independent national regulatory authorities or bodies, or both, shall participate in the Board. Other national authorities may be invited to the meetings where the issues discussed are of relevance to them.
2021/07/08
Committee: IMCO
Amendment 2274 #
Proposal for a regulation
Article 67 – paragraph 1
1. The Commission shall establish and maintain a reliable and secure information sharing system supporting communications between Digital Services Coordinators, the Commission and the Board, based on the Internal Market Information System.
2021/07/08
Committee: IMCO