
99 Amendments of Paul TANG related to 2020/0361(COD)

Amendment 130 #
Proposal for a regulation
Recital 1
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council25 , new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks, not least cybersecurity risks, and challenges, both for individual users and for society and the economy as a whole. _________________ 25Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2021/09/10
Committee: ECON
Amendment 133 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the right to non-discrimination, privacy, protection of personal data, freedom of expression and information, consumer protection and the freedom to conduct a business.
2021/06/10
Committee: LIBE
Amendment 137 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective, risk-based and proportionate mandatory rules should be established at Union level. This Regulation provides the right conditions and competitive settings for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, fostering interoperability and assuring the possibility for new entrants to enter the market. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.
2021/09/10
Committee: ECON
Amendment 149 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons from a technical perspective, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre- determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/06/10
Committee: LIBE
Amendment 153 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services has knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. The exemptions from liability established in this Regulation should not depend on uncertain notions such as an ‘active’, ‘neutral’ or ‘passive’ role of providers.
2021/06/10
Committee: LIBE
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of the information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to creating financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and participation at multiple levels of the advertising value chain has created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients. 
Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
2021/07/08
Committee: IMCO
Amendment 202 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact, that is free of charge, and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/09/10
Committee: ECON
Amendment 206 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability, validity and completeness of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing to only allow legitimate traders on their online interfaces. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/06/10
Committee: LIBE
Amendment 208 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens, to interfering with individual fundamental rights, such as respect for privacy and data protection. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/06/10
Committee: LIBE
Amendment 212 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models, which is likely to cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/06/10
Committee: LIBE
Amendment 217 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through advertising, recommender systems or through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to data protection, the right to non-discrimination, the rights of the child and consumer protection. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/06/10
Committee: LIBE
Amendment 225 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently and effectively mitigate the systemic risks identified in the risk assessment. Such mitigating measures should include, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithms, recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 228 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, including through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Online platforms should not be able to use confidentiality of trade secrets as reasons to refuse access to relevant information that auditors need to perform their tasks. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
2021/06/10
Committee: LIBE
Amendment 234 #
Proposal for a regulation
Recital 62
(62) A core part of many online platforms’ business is the manner in which information is prioritised and presented on their online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient and that they do not process personal data across devices or combine citizens’ personal data across online interfaces.
2021/06/10
Committee: LIBE
Amendment 237 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens, to interfering with individual fundamental rights, such as respect for privacy and data protection. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/09/10
Committee: ECON
Amendment 243 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal and economic risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative socioeconomic impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
2021/09/10
Committee: ECON
Amendment 244 #
Proposal for a regulation
Recital 55
(55) In view of the network effects characterising the platform economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market, economic actors and consumers. This may be the case in the event of exponential growth experienced in short periods of time, or by a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator should be able to request more frequent reporting from the platform on the user base to be able to timely identify the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation.
2021/09/10
Committee: ECON
Amendment 255 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, the competitive aspect of the economy, security of trade, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/09/10
Committee: ECON
Amendment 265 #
Proposal for a regulation
Recital 65
(65) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society and the economy, very large online platforms should appoint compliance officers, who should have the necessary qualifications to operationalise measures and monitor the compliance with this Regulation within the platform’s organisation. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation.
2021/09/10
Committee: ECON
Amendment 282 #
Proposal for a regulation
Recital 77
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers, human resources and financial means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in its territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. Furthermore, the Digital Services Coordinator of each Member State should establish a structured working relationship with the National Competition Authorities as well as the Financial Regulatory Authorities working on their territory.
2021/09/10
Committee: ECON
Amendment 289 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to directly or indirectly promote or rank information, products or services of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface or parts thereof against direct or indirect remuneration specifically for promoting that information, product or service;
2021/06/10
Committee: LIBE
Amendment 292 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank or prioritise in its online interface specific information, products or services to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
2021/06/10
Committee: LIBE
Amendment 294 #
Proposal for a regulation
Recital 94
(94) Given the importance of very large online platforms, in view of their reach and impact, their failure to comply with the specific obligations applicable to them may affect a substantial number of recipients of the services across different Member States and may cause large societal and economic harms, while such failures may also be particularly complex to identify and address.
2021/09/10
Committee: ECON
Amendment 295 #
Proposal for a regulation
Article 2 a (new)
Article 2a
Targeting of digital advertising
1. Providers of information society services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of determining the recipients to whom advertisements are displayed.
2. This provision shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information such as keywords, the language setting communicated by the device of the recipient or the geographical region of the recipients to whom an advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of one or more natural persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or persons.
2021/06/10
Committee: LIBE
Amendment 305 #
Proposal for a regulation
Article 1 – paragraph 2 – point a
(a) contribute to the proper functioning of the internal market for intermediary services and impacted economic actors;
2021/09/10
Committee: ECON
Amendment 316 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the compulsory measures to comply with the requirements of Union law, including those set out in this Regulation.
2021/06/10
Committee: LIBE
Amendment 321 #
Proposal for a regulation
Article 7 – title
No general monitoring, active fact-finding or automated content moderation obligations
2021/06/10
Committee: LIBE
Amendment 324 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation shall be imposed to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.
2021/06/10
Committee: LIBE
Amendment 325 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services shall not be obliged to use automated tools for content moderation.
2021/06/10
Committee: LIBE
Amendment 330 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the messagedirectly or indirectly promote or rank information, products or services of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface againsor parts thereof against direct or indirect remuneration specifically for promoting that information, product or service;
2021/09/10
Committee: ECON
Amendment 336 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
the identification details of the judicial authority issuing the order and a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
2021/06/10
Committee: LIBE
Amendment 371 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about redress mechanisms available to the provider and to the recipients of the service concerned;
2021/06/10
Committee: LIBE
Amendment 376 #
Proposal for a regulation
Article 9 – paragraph 2 – point a a (new)
(a a) the order is securely and easily authenticated;
2021/06/10
Committee: LIBE
Amendment 379 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already legally collected for the purposes of providing the service and which lies within its control;
2021/06/10
Committee: LIBE
Amendment 381 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language setting communicated by the device of the recipient or the digital location where the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or a clearly identifiable group of recipients/persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
2021/09/10
Committee: ECON
Amendment 383 #
Proposal for a regulation
Article 9 – paragraph 3 a (new)
3 a. The provider shall inform the recipient whose data is being sought without undue delay. As long as necessary and proportionate, in order to protect the fundamental rights of another person, the issuing judicial authority, taking into due account the impact of the request on the fundamental rights of the person whose data is sought, may request the provider to delay informing the recipient. Such a request shall be duly justified, specify the duration of the obligation of confidentiality and shall be subject to periodic review.
2021/06/10
Committee: LIBE
Amendment 387 #
Proposal for a regulation
Article 9 – paragraph 3 e (new)
3 e. The Commission shall, by means of implementing acts, establish a common European information exchange system with secure channels for the handling of authorised cross-border communications, authentication and transmission of the order referred to in paragraph 1 and, where applicable, of the requested data between the competent judicial authority and the provider.
2021/06/10
Committee: LIBE
Amendment 396 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete as well as the relationship, economic or otherwise, if any, the individual or entity has with the notified entity.
2021/09/10
Committee: ECON
Amendment 429 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Online advertising transparency
Providers of intermediary services that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise and unambiguous manner and in real time:
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
(c) clear, meaningful and uniform information about the parameters used to determine the recipient to whom the advertisement is displayed; and
(e) if the advertisement was displayed using an automated tool and the identity of the person responsible for that tool.
2. The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article.
3. Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed. They shall also inform public authorities, upon their request.
4. Providers of intermediary services that display advertising on their online interfaces shall be able to give easy access to public authorities, NGOs, and researchers, upon their request, to information related to direct and indirect payments or any other remuneration received to display the corresponding advertisement on their online interfaces.
2021/06/10
Committee: LIBE
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language setting communicated by the device of the recipient or the digital location where the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or a clearly identifiable group of recipients/persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
2021/06/10
Committee: LIBE
Amendment 442 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
(c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;deleted
2021/06/10
Committee: LIBE
Amendment 461 #
Proposal for a regulation
Recital 52 a (new)
(52a) The market position of very large online platforms allows them to collect and combine enormous amounts of personal data, thereby strengthening their market position vis-a-vis smaller competitors, while at the same time incentivising other online platforms to take part in comparable data collection practices and thus creating an unfavourable environment for consumers. Therefore, the collecting and further processing of personal data for the purpose of displaying tailored advertisement should be prohibited. The selection of advertisements shown to a consumer should consequently be based on contextual information, such as language settings by the device of the user or the digital location. Besides a positive effect on privacy and data protection rights of users, the ban will increase competition on the market and will facilitate market access for smaller online platforms and privacy-friendly business models.
2021/07/08
Committee: IMCO
Amendment 465 #
Proposal for a regulation
Recital 52 b (new)
(52b) The ban on targeted advertising should not hinder contextual advertisement, such as the displaying of a car advertisement on a website presenting information from the automotive sector.
2021/07/08
Committee: IMCO
Amendment 478 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that directly or indirectly display advertising on their online interfaces or parts thereof shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipientconsumer, in a clear, concise but meaningful, uniform and unambiguous manner and in real time:
2021/09/10
Committee: ECON
Amendment 479 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed is an advertisement and whether the advertisement is a result of an automated mechanism, such as an advertising exchange mechanism;
2021/09/10
Committee: ECON
Amendment 480 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly finances the advertisement;
2021/09/10
Committee: ECON
Amendment 481 #
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
(ba) whether the advertising is based on any form of targeting; and
2021/09/10
Committee: ECON
Amendment 482 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful, granular and specific information about the main parameters used to determine the recipient to whom the advertisement is displayedtarget and display the advertisement, which allows the consumer to determine why and how the advertisement in question was shown to him or her. This information shall include categories of data that targeted forms of advertising would use to address and categorise consumers and the data platforms share with advertisers for advertising targeting purposes.
2021/09/10
Committee: ECON
Amendment 485 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(ca) the remuneration that is given by the advertiser;
2021/09/10
Committee: ECON
Amendment 498 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall publicly disclose the rules of procedure of their internal complaint handling system in their terms and conditions and shall present them to recipients of the service in a clear, user-friendly and easily accessible manner, when willing to present a complaint.
2021/06/10
Committee: LIBE
Amendment 508 #
Proposal for a regulation
Recital 65 a (new)
(65a) Due to their market position, very large online platforms have developed an increasing influence over society’s social, economic, and political interactions. Consumers face a lock-in situation, which may lead them into accepting unfavourable terms and conditions to participate in the services provided by these very large online platforms. To restore a competitive market and to allow consumers more choices, very large online platforms should be required to set up the necessary technical access points to create interoperability for their core services, with a view to allowing competitors fairer market access and enabling more choice for consumers, while at the same time complying with privacy, security and safety standards. These access points should create interoperability for other online platform services of the same type, without the need to convert digital content or services to ensure functionality.
2021/07/08
Committee: IMCO
Amendment 530 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/09/10
Committee: ECON
Amendment 532 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Very large online platforms shall offer users the choice of recommender systems from first and third party providers. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the very large online platform of its own recommender systems.
2021/09/10
Committee: ECON
Amendment 533 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2b. Very large online platforms may only limit access to third party recommender systems temporarily in cases of demonstrable abuse by the third party provider or when justified by an immediate requirement to address technical problems such as a serious security vulnerability.
2021/09/10
Committee: ECON
Amendment 537 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly financed the advertisement;
2021/09/10
Committee: ECON
Amendment 538 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
(ca) The remuneration that has been paid by the advertiser and the remuneration the online platform earned;
2021/09/10
Committee: ECON
Amendment 540 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to exclude or be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose;
2021/09/10
Committee: ECON
Amendment 548 #
Proposal for a regulation
Article 33 a (new)
Article 33 a
Interoperability
1. By 31 December 2023 very large online platforms shall make the main functionalities of their services interoperable with other online platforms to enable cross-platform exchange of information. This obligation shall not limit, hinder or delay their ability to solve security issues. Very large online platforms shall publicly document all application programming interfaces they make available.
2. The Commission shall adopt implementing measures specifying the nature and scope of the obligations set out in paragraph 1.
2021/09/10
Committee: ECON
Amendment 557 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, especially those exercising a dominant position, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
2021/09/10
Committee: ECON
Amendment 581 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2021/06/10
Committee: LIBE
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of displaying digital advertising to a specific recipient or group of recipients.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language or the approximate geographical location of the recipient of the service to whom the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if the advertisement is displayed in real time and it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or group of persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or group of persons.
2021/07/19
Committee: JURI
Amendment 586 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it hasthere are reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
2021/09/10
Committee: ECON
Amendment 595 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that directly or indirectly display advertising on their online interfaces or parts thereof shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipientconsumer, in a clear, concise but meaningful, uniform and unambiguous manner and in real time:
2021/06/10
Committee: LIBE
Amendment 596 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed is an advertisement and whether the advertisement is a result of an automated mechanism, such as an advertising exchange mechanism;
2021/06/10
Committee: LIBE
Amendment 599 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly finances the advertisement;
2021/06/10
Committee: LIBE
Amendment 600 #
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
(b a) whether the advertising is based on any form of targeting; and
2021/06/10
Committee: LIBE
Amendment 601 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful, granular and specific information about the main parameters used to determine the recipient to whom the advertisement is displayed. target and display the advertisement, which allows the consumer to determine why and how the advertisement in question was shown to him or her. This information shall include categories of data that targeted forms of advertising would use to address and categorise consumers and the data platforms share with advertisers for advertising targeting purposes.
2021/06/10
Committee: LIBE
Amendment 606 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(c a) The remuneration that is given by the advertiser;
2021/06/10
Committee: LIBE
Amendment 616 #
Proposal for a regulation
Article 25 – paragraph 3
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union and emerging very large online platforms, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features.
2021/06/10
Committee: LIBE
Amendment 621 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall aim to effectively and diligently identify, analyse and objectively assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/06/10
Committee: LIBE
Amendment 639 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place transparent, reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures mayshall include, where applicable:
2021/06/10
Committee: LIBE
Amendment 646 #
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(a a) appropriate technical and operational measures or capacities, such as appropriate staffing or technical means to expeditiously remove or disable access to illegal content the platform is aware of, or has received an order to act upon;
2021/06/10
Committee: LIBE
Amendment 647 #
Proposal for a regulation
Article 27 – paragraph 1 – point a b (new)
(a b) easily accessible and user-friendly mechanisms for users to report or flag allegedly illegal content, and mechanisms for user moderation;
2021/06/10
Committee: LIBE
Amendment 658 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms and stakeholders through the codes of conduct and the crisis protocols referred to in Article 35 and 37 respectively.
2021/06/10
Committee: LIBE
Amendment 670 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices and recommendations for very large online platforms to effectively mitigate the systemic risks identified.
2021/06/10
Committee: LIBE
Amendment 678 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to external independent audits to assess compliance with the following:
2021/06/10
Committee: LIBE
Amendment 689 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations, vetted by the Board, which:
2021/06/10
Committee: LIBE
Amendment 695 #
Proposal for a regulation
Article 28 – paragraph 3 – introductory part
3. The organisations that perform the audits shall establish an meaningful, granular, comprehensive audit report for each audit. The report shall be in writing and include at least the following:
2021/06/10
Committee: LIBE
Amendment 696 #
Proposal for a regulation
Article 28 – paragraph 3 – point d
(d) a description of the main findings drawn from the audit and a summary of the main findings;
2021/06/10
Committee: LIBE
Amendment 697 #
Proposal for a regulation
Article 28 – paragraph 3 – point d a (new)
(d a) a description of specific elements that could not be audited to the auditor’s satisfaction, and an explanation of why these elements could not be audited;
2021/06/10
Committee: LIBE
Amendment 704 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operationalshall ensure auditors have access to all relevant information to perform their duties. Very large online platforms receiving an audit report that contains evidence of wrongdoings shall ensure to apply the recommendations addressed to them with a view to take all the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
2021/06/10
Committee: LIBE
Amendment 712 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679.
2021/06/10
Committee: LIBE
Amendment 714 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1 a. The parameters referred to in paragraph 1 shall include, at a minimum:
(a) the recommendation criteria used by the relevant system;
(b) how these criteria are weighted against each other;
(c) what goals the relevant system has been optimised for; and
(d) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs.
2021/06/10
Committee: LIBE
Amendment 719 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several optionOnline platforms are available pursuant to paragraph 1, very large online platformsferred to in paragraph 1 shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. Online platforms shall ensure that the option that is activated by default is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679.
2021/06/10
Committee: LIBE
Amendment 721 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2 a. Very large online platforms shall offer users the choice of recommender systems from first and third party providers. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the very large online platform of its own recommender systems.
2021/06/10
Committee: LIBE
Amendment 723 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2 b. Very large online platforms may only limit access to third party recommender systems temporarily in cases of demonstrable abuse by the third party provider or when justified by an immediate requirement to address technical problems such as a serious security vulnerability.
2021/06/10
Committee: LIBE
Amendment 728 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed; and who directly or indirectly financed the advertisement;
2021/06/10
Committee: LIBE
Amendment 730 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
(c a) The remuneration that has been paid by the advertiser and the remuneration the online platform earned;
2021/06/10
Committee: LIBE
Amendment 732 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to exclude or be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose;
2021/06/10
Committee: LIBE
Amendment 746 #
Proposal for a regulation
Article 2 a (new)
Article 2a
1. Providers of information society services shall only deliver and display advertising that is based on contextual information such as keywords, language context, or the approximate geographical region of the recipient of the service to whom an advertisement is delivered or displayed.
2. The use of the contextual information referred to in paragraph 1 shall only be permissible if the advertisement is delivered in real time, that related data are not stored and that it does not involve the direct or, by means of combining it with other information, indirect identification of a natural person or group of persons, in particular by reference to an identifier such as a name, an identification number, precise location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or group of persons.
3. Providers of information society services that deliver and display advertising on their online interfaces or on third-party services shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time:
(a) that the information displayed is an advertisement;
(b) the natural or legal person on whose behalf the advertisement is displayed;
(c) detailed information about the main parameters used to determine the recipient to whom the advertisement is delivered and displayed.
2021/07/08
Committee: IMCO
Amendment 1019 #
Proposal for a regulation
Article 13 b (new)
Article 13b
Targeted advertising
Providers of intermediary services shall not collect or use personal data of a service recipient for the purpose of targeting or tailoring digital advertising. If a service provider legitimately receives information that allows it to make assumptions about the physical, physiological, genetic, mental, economic, cultural or social identity of a user, this information shall not be used for advertising purposes, specifically not for targeting or tailoring of advertising.
2021/07/08
Committee: IMCO
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Online interface design and organisation
1. Providers of hosting services shall not distort or impair consumers’ ability to make an informed decision via the structure, function or manner of operation of their online interface or a part thereof.
2. Providers of hosting services shall design and organise their online interface in a way that enables themselves and traders to comply with their obligations under applicable Union and Member State law on consumer protection, including on product safety.
2021/07/08
Committee: IMCO
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable through easy to access, functionable and reliable tools through application programming interfaces a repository containing the information referred to in paragraph 2, until onfive year after the advertisement was displayed for the last time on their online interfaces. They shall ensure multi- criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
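Purely as an illustration outside the legislative text, the searchable ad repository with multi-criterion queries and aggregated spend data that the amendment to Article 30(1) describes could be sketched as follows. All names (`AdRecord`, `AdRepository`, `query`) are hypothetical; the amendment prescribes no schema.

```python
from dataclasses import dataclass

# Illustrative sketch only: the Regulation does not prescribe a data model.

@dataclass
class AdRecord:
    advertiser: str
    content: str
    amount_spent: float      # aggregated spend for this advertisement
    target_criteria: dict    # e.g. {"age": "18-35", "interest": "sports"}
    audience_reached: int
    last_displayed: str      # ISO date; records kept five years after this

class AdRepository:
    def __init__(self):
        self._records: list[AdRecord] = []

    def add(self, record: AdRecord) -> None:
        self._records.append(record)

    def query(self, **criteria) -> list[AdRecord]:
        """Multi-criterion query: every keyword must match the record's field."""
        hits = self._records
        for key, value in criteria.items():
            hits = [r for r in hits if getattr(r, key, None) == value]
        return hits

    def aggregate_spend(self, advertiser: str) -> float:
        """Aggregated amount spent per advertiser, as paragraph 1 requires."""
        return sum(r.amount_spent for r in self._records
                   if r.advertiser == advertiser)
```

The repository itself would additionally need to be exposed over a public application programming interface; this sketch covers only the query and aggregation logic.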
Amendment 1737 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The archive shall be easily accessible for users and shall contain a complaint and reporting option for users, addressed directly to the platform and the responsible advertising service provider. The requirements for notifications under Article 14 shall also apply to notifications and complaints about advertising content.
2021/07/08
Committee: IMCO
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
2021/07/08
Committee: IMCO
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after the entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/08
Committee: IMCO
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33a Algorithm accountability 1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used. 2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements: (a) compliance with corresponding Union requirements; (b) how the algorithm is used and its impact on the provision of the service; (c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and (d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c). 3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations. 4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, where the additional information provided on the findings is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission. 5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure that the algorithm complies with the criteria set out in paragraph 2. 6. Where the Commission finds, on the basis of the information provided by the very large online platform, that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, and that the very large online platform has not undertaken the corrective measures referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
2021/07/08
Committee: IMCO
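Outside the legislative text, the four assessment elements of the proposed Article 33a(2) and the corrective-measures trigger of paragraph 5 could be captured in a simple record like the following sketch. The structure and all names are assumptions; the amendment prescribes no data format.

```python
from dataclasses import dataclass

# Hypothetical sketch of the assessment elements in Article 33a(2);
# the amendment prescribes no format, so this structure is an assumption.

@dataclass
class AlgorithmAssessment:
    union_compliance: bool          # (a) compliance with Union requirements
    usage_and_impact: str           # (b) how the algorithm is used in the service
    fundamental_rights_impact: str  # (c) impact on fundamental/consumer rights
    resilience_adequate: bool       # (d) adequacy of resilience measures

    def corrective_measures_required(self) -> bool:
        """Paragraph 5: corrective measures where point (a) or (d) fails."""
        return not (self.union_compliance and self.resilience_adequate)
```

Such a record would also be a natural unit to communicate to the Commission under paragraph 4.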
Amendment 1809 #
Proposal for a regulation
Article 33 a (new)
Article 33a Interoperability 1. Very large online platforms shall, by creating and offering an application programming interface, provide options enabling the interoperability of their core services with other online platforms. 2. Application programming interfaces should be easy to use, while the processing of personal data shall only be possible in a manner that ensures appropriate security of those data. Measures under paragraph 1 may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue delay in the provision of interoperability. 3. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
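As a purely illustrative aside, the interoperability interface envisaged by this Article 33a could take a shape like the sketch below: an abstract application programming interface that other platforms call, with one example implementation. The amendment defines no endpoints, so every name here (`InteropAPI`, `fetch_public_posts`, `deliver_message`) is invented.

```python
from abc import ABC, abstractmethod

# Purely illustrative: the amendment mandates an API for core-service
# interoperability but does not define its surface; all names are hypothetical.

class InteropAPI(ABC):
    """Interface another online platform could call."""

    @abstractmethod
    def fetch_public_posts(self, account_id: str, since: str) -> list[dict]:
        """Return public posts of an account from a given ISO date onwards."""

    @abstractmethod
    def deliver_message(self, recipient_id: str, payload: dict) -> bool:
        """Deliver a cross-platform message; returns True on acceptance."""

class ExamplePlatform(InteropAPI):
    def __init__(self):
        self._posts: dict[str, list[dict]] = {}
        self._inbox: dict[str, list[dict]] = {}

    def publish(self, account_id: str, post: dict) -> None:
        self._posts.setdefault(account_id, []).append(post)

    def fetch_public_posts(self, account_id, since="1970-01-01"):
        # ISO date strings compare chronologically, so >= works here.
        return [p for p in self._posts.get(account_id, [])
                if p["date"] >= since]

    def deliver_message(self, recipient_id, payload):
        # Only the minimal data needed is processed, in line with paragraph 2.
        self._inbox.setdefault(recipient_id, []).append(payload)
        return True
```

A production interface would additionally need authentication and the security safeguards paragraph 2 requires; this sketch shows only the interoperability surface.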
Amendment 1894 #
Proposal for a regulation
Article 36 a (new)
Article 36a Codes of conduct for short-term holiday rentals 1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms, short-term holiday rental providers, and relevant authorities to contribute to the proper enforcement of the authorisation and registration schemes for short-term holiday rentals. 2. The Commission shall aim to ensure that the codes of conduct lead to the development of effective mechanisms for online platforms to verify and track short-term holiday rental providers’ compliance with national registration and authorisation requirements. The Commission shall encourage the codes of conduct to be drawn up within one year following the date of application of this Regulation and to be applied no later than six months after that date.
2021/07/08
Committee: IMCO
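As an illustration outside the legislative text, the verification mechanism such codes of conduct could standardise might be as simple as checking a host's registration number against the relevant authority's registry before a listing goes live. The function name and registry format below are assumptions.

```python
# Illustrative sketch: a platform-side check of a short-term holiday rental
# provider's registration number against a (hypothetical) authority registry.

def listing_allowed(registration_number: str,
                    authority_registry: set[str]) -> bool:
    """A listing is allowed only if the provider's registration number
    appears in the relevant authority's registry."""
    return registration_number in authority_registry
```

In practice the registry lookup would be an API call to the relevant authority rather than a local set, and the platform would re-check periodically to "track" continued compliance.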