111 Amendments of Maria GRAPINI related to 2020/0361(COD)
Amendment 23 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective and reliable protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant and accurate information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 29 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations and administrative requirements imposed on online platforms, especially the smaller ones, under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 49 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts, civil society organisations and consumer protection associations.
Amendment 50 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful, factual and objective account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
Amendment 52 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient, giving the latter a choice regarding the purchase of services and products.
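For illustration only, the choice this recital describes could look like the following minimal sketch: a recommender that exposes its main parameters to the recipient and offers an alternative ranking not based on profiling. All identifiers (Item, RecommenderSettings, rank_items) are hypothetical; the Regulation does not prescribe any implementation.

```python
# Minimal sketch of a recommender exposing its "main parameters" and offering
# a non-profiling alternative, as described in Recital 62. All names are
# hypothetical illustrations, not part of the Regulation.
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    topic: str
    posted_at: float          # Unix timestamp
    engagement_score: float   # platform-computed popularity signal

@dataclass
class RecommenderSettings:
    # Parameters presented to and controlled by the recipient.
    use_profiling: bool = False           # at least one option without profiling
    preferred_topics: list = field(default_factory=list)

def rank_items(items, settings, user_profile=None):
    """Return items ordered according to the recipient's chosen parameters."""
    if settings.use_profiling and user_profile is not None:
        # Profiling-based option: weight items by inferred user interests.
        return sorted(
            items,
            key=lambda i: user_profile.get(i.topic, 0.0) * i.engagement_score,
            reverse=True,
        )
    # Non-profiling alternative: purely chronological, optionally filtered
    # by topics the recipient selected explicitly.
    pool = [i for i in items if not settings.preferred_topics
            or i.topic in settings.preferred_topics]
    return sorted(pool, key=lambda i: i.posted_at, reverse=True)

items = [
    Item("a", "cars", 1_700_000_000, 0.9),
    Item("b", "sports", 1_700_000_500, 0.4),
]
print([i.item_id for i in rank_items(items, RecommenderSettings())])
```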
Amendment 91 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts immediately to remove or to disable access to the illegal content.
Amendment 105 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without any delay that cannot be properly justified, specifying the action taken and the moment when the action was taken.
Amendment 116 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of its receipt and the effect given to the order, and shall do so without any delay that cannot be properly justified.
Amendment 132 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, comprehensible and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 135 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without any delay that cannot be properly justified, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
Amendment 137 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, within a maximum of 40 days and in a diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 148 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system within a maximum of 60 days and in a diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 149 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without any delay that cannot be properly justified of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities.
Amendment 155 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a period of at least 60 days and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
Amendment 156 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a period of at least 60 days and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
Amendment 157 #
Proposal for a regulation
Article 20 – paragraph 4 a (new)
4a. Platforms shall ensure that the personal data of consumers are not made public.
Amendment 185 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner, without any possibility of deception and in real time:
Amendment 188 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights and freedoms guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, a high level of consumer protection and the right to non- discrimination.
Amendment 208 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non- compliance identified.
Amendment 211 #
Proposal for a regulation
Recital 9
(9) This Regulation fully harmonises the rules applicable to intermediary services when dealing with illegal content online in the internal market to ensure a safe, predictable and trusted online environment where fundamental rights enshrined in the Charter are effectively protected, in order to improve the functioning of the Internal Market. Accordingly, Member States should not adopt or maintain additional national requirements on those matters falling within the scope of this Regulation, unless this would affect the direct and uniform application of the fully harmonised rules applicable to the providers of intermediary services which are necessary to ensure the proper functioning of the internal market. The Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 212 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests or conflicts of interest, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 215 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on fair competition and the protection of personal data.
Amendment 217 #
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
2. The Commission shall aim to ensure that the codes of conduct pursue an effective and accurate transmission of information without delay, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on fair competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least:
Amendment 218 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than three months after that date.
Amendment 233 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law where that is in conformity with Union law and what the precise nature or subject matter is of the law in question.
Amendment 238 #
Proposal for a regulation
Article 42 – paragraph 3
3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 12% of the annual income or turnover of the provider concerned.
Amendment 238 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request; or interact with user generated content, or retain available technical capabilities to address the problem in the most expedient and proportionate manner. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 250 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre- determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by the definition of online platforms. __________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 263 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a period of two months.
Amendment 264 #
Proposal for a regulation
Article 52 – paragraph 2
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect, false or misleading information.
Amendment 265 #
Proposal for a regulation
Article 52 – paragraph 4
4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect, false or misleading.
Amendment 269 #
Proposal for a regulation
Article 59 – paragraph 2 – point a
(a) supply incorrect, incomplete, false or misleading information in response to a request pursuant to Article 52 or, when the information is requested by decision, fail to reply to the request within the set time period;
Amendment 271 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not select, rank or modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
Amendment 272 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content taking into account the potential harm the illegal content in question may create. In order to ensure a harmonised implementation of illegal content removal throughout the Union, the provider should, within 24 hours, remove or disable access to illegal content that can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety. According to the well-established case-law of the Court of Justice and in line with Directive 2000/31/EC, the concept of ‘public policy’ involves a genuine, present and sufficiently serious threat which affects one of the fundamental interests of society, in particular for the prevention, investigation, detection and prosecution of criminal offences, including the protection of minors and the fight against any incitement to hatred on grounds of race, sex, religion or nationality, and violations of human dignity concerning individual persons. The concept of ‘public security’ as interpreted by the Court of Justice covers both the internal security of a Member State, which may be affected by, inter alia, a direct threat to the physical security of the population of the Member State concerned, and the external security, which may be affected by, inter alia, the risk of a serious disturbance to the foreign relations of that Member State or to the peaceful coexistence of nations. Where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety, the provider should remove or disable access to illegal content within seven days. The deadlines referred to in this Regulation should be without prejudice to specific deadlines set out in Union law or within administrative or judicial orders. The provider may derogate from the deadlines referred to in this Regulation on the grounds of force majeure or for justifiable technical or operational reasons but it should be required to inform the competent authorities as provided for in this Regulation. The removal or disabling of access should be undertaken in the observance of the Charter of Fundamental Rights, including a high level of consumer protection and freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 302 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner, with the appropriate safeguards against over-removal of legal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 341 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
Amendment 343 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, and to ensure a safe and transparent online environment and a high level of consumer protection, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, security and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 360 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
Amendment 375 #
Proposal for a regulation
Recital 39 a (new)
(39a) In order to effectively and meaningfully address the proliferation of illegal goods and services online, intermediary services should implement measures to prevent illicit content from reappearing after having been taken down. Such measures, undertaken horizontally by all intermediary services, will contribute to a safer online environment.
Amendment 415 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations, consumer organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
Amendment 447 #
Proposal for a regulation
Recital 50 a (new)
(50a) After having obtained the necessary contact information of a trader, which is aimed at ensuring consumer rights, a provider of intermediary services needs to verify that these details are consistently kept up to date and accessible for consumers. Therefore, it shall conduct regular and randomised checks on the information provided by the traders on its platform. To ensure a consistent display of this contact information, intermediary services should establish mandatory designs for its inclusion. Content, goods or services shall only be displayed after all necessary information has been made available by the business user.
Amendment 461 #
Proposal for a regulation
Recital 52 a (new)
(52a) The market position of very large online platforms allows them to collect and combine enormous amounts of personal data, thereby strengthening their market position vis-à-vis smaller competitors, while at the same time incentivising other online platforms to take part in comparable data collection practices and thus creating an unfavourable environment for consumers. Therefore, the collection and further processing of personal data for the purpose of displaying tailored advertisements should be prohibited. The selection of advertisements shown to a consumer should consequently be based on contextual information, such as the language settings of the user’s device or the digital location. Besides a positive effect on the privacy and data protection rights of users, the ban will increase competition on the market and will facilitate market access for smaller online platforms and privacy-friendly business models.
Amendment 465 #
Proposal for a regulation
Recital 52 b (new)
(52b) The ban on targeted advertising should not hinder contextual advertisement, such as the displaying of a car advertisement on a website presenting information from the automotive sector.
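Purely as an illustration of the distinction drawn in Recitals 52a and 52b, the sketch below selects an advertisement from contextual signals alone (page topic, device language), with no user profile involved. All names (Ad, select_contextual_ad, the sample topics) are hypothetical.

```python
# Minimal sketch of contextual (non-profiling) ad selection: the choice of
# advertisement depends only on properties of the page and device, never on
# a personal profile. All names are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Ad:
    ad_id: str
    topics: frozenset     # topics the advertiser booked, e.g. {"automotive"}
    languages: frozenset  # languages the creative is available in

def select_contextual_ad(ads, page_topics, device_language):
    """Pick an ad using only contextual signals: page topic and language.

    Note what is absent: no user identifier, no browsing history, no
    inferred interests -- the inputs are properties of the context alone.
    """
    for ad in ads:
        if device_language in ad.languages and ad.topics & set(page_topics):
            return ad
    return None  # fall back to a house ad or no ad at all

ads = [
    Ad("car-123", frozenset({"automotive"}), frozenset({"en", "ro"})),
    Ad("shoe-456", frozenset({"fashion"}), frozenset({"en"})),
]
# A car ad on an automotive-sector page, matching the recital's example.
print(select_contextual_ad(ads, page_topics=["automotive"], device_language="ro"))
```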
Amendment 481 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including unsafe, counterfeit or non-compliant products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 508 #
Proposal for a regulation
Recital 65 a (new)
(65a) Due to their market position, very large online platforms have developed an increasing influence over society’s social, economic, and political interactions. Consumers face a lock-in situation, which may lead them into accepting unfavourable terms and conditions to participate in the services provided by these very large online platforms. To restore a competitive market and to allow consumers more choices, very large online platforms should be required to set up the necessary technical access points to create interoperability for their core services, with a view to allowing competitors a fairer market access and enabling more choice for consumers, while at the same time complying with privacy, security and safety standards. These access points should create interoperability for other online platform services of the same type, without the need to convert digital content or services to ensure functionality.
Amendment 534 #
Proposal for a regulation
Recital 73
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure regular reporting and effective involvement of all relevant authorities in the supervision and enforcement at Union level.
Amendment 599 #
Proposal for a regulation
Article 1 – paragraph 1 – introductory part
1. This Regulation lays down harmonised rules on the provision of intermediary services in order to improve the functioning of the internal market. In particular, it establishes:
Amendment 604 #
Proposal for a regulation
Article 1 – paragraph 2 – point a
(a) contribute to the proper functioning of the internal market for intermediary services to ensure fair competition;
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
Amendment 640 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
Amendment 644 #
Proposal for a regulation
Article 1 a (new)
Article 1a
Contractual provisions
1. Any contractual provisions between an intermediary service provider and a trader, business user, or a recipient of its service which are contrary to this Regulation shall be unenforceable.
2. This Regulation shall apply irrespective of the law applicable to contracts concluded between providers of intermediary services and a recipient of the service, a consumer, a trader or business user.
Amendment 646 #
Proposal for a regulation
Article 1 a (new)
Article 1a
Objective
The aim of this Regulation is to contribute to the proper functioning of the internal market by setting out harmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
Amendment 682 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or with a law of a Member State where it is in conformity with Union law, irrespective of the precise subject matter or nature of that law;
Amendment 696 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service:
(a) stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation, or
(b) interacts with user generated content, or
(c) retains available technical capabilities to address the problem in the most expedient and proportionate manner.
Amendment 765 #
Proposal for a regulation
Article 5 – paragraph 2
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority, decisive influence or the control of the provider.
Amendment 777 #
Proposal for a regulation
Article 5 a (new)
Amendment 804 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities and addressed directly to the service provider in their country of origin, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 818 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1 a (new)
— precise indication of the credentials of the relevant national judicial or administrative authority issuing the order and details of the contact person(s) within the said authority;
Amendment 910 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. Very large online platforms shall designate a legal representative in each of the Member States where the provider offers its services.
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service in fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format in a searchable archive of all the previous versions with their date of application.
Amendment 995 #
Proposal for a regulation
Article 13 – paragraph 1 – point d a (new)
(da) Providers of intermediary services shall, when complying with the requirements of this Article, not be required to disclose information that, with reasonable certainty, would result in public harm through the manipulation of content moderation procedures or the disclosure of trade secrets, in line with Directive (EU) 2016/943.
Amendment 998 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure that the identity, such as the trademark, logo or other characteristic traits, of the business user providing the goods, content or services on the intermediary services is clearly visible alongside the goods, content or services offered.
Amendment 1015 #
Proposal for a regulation
Article 13 a (new)
Amendment 1018 #
Proposal for a regulation
Article 13 a (new)
Article 13a
Measures against the reappearance of illegal content
Where an intermediary service detects and identifies illegal goods or services, it shall prevent this content from reappearing on its service. The application of this requirement shall not lead to any general monitoring obligation.
Amendment 1019 #
Proposal for a regulation
Article 13 b (new)
Amendment 1029 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means, for example through online web forms.
Amendment 1078 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where an online platform that allows consumers to conclude distance contracts with traders detects and identifies illegal goods or services, it shall be obliged to establish an internal database of those goods and services that had previously been taken down by the online platform because they had been found to be illegal or harmful. It shall, drawing on the elements listed in the Rapid Exchange of Information System (RAPEX) and other relevant public databases, scan its database on a daily basis to detect illegal goods and services. If this process detects a good or service that has previously been found to be illegal or harmful, the online platform shall be obliged to delete the content expeditiously.
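As a purely illustrative sketch of the daily scan this paragraph describes, the following matches active listings against an internal takedown database combined with keys derived from public alerts such as RAPEX. The identifiers, the single product_key matching field and the data formats are hypothetical simplifications; a real system would need richer matching signals (images, seller data, product codes).

```python
# Minimal sketch of the daily matching described in this paragraph: listings
# taken down as illegal are kept in an internal database, merged with keys
# derived from public sources such as RAPEX, and active listings are checked
# against that set every day. All names and formats are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Listing:
    listing_id: str
    product_key: str  # e.g. a normalised brand + model string or GTIN

def daily_scan(active_listings, takedown_db, rapex_entries):
    """Return active listings matching previously removed or alerted products."""
    blocked_keys = {l.product_key for l in takedown_db}
    blocked_keys |= set(rapex_entries)  # keys derived from public alerts
    return [l for l in active_listings if l.product_key in blocked_keys]

takedown_db = [Listing("old-1", "acme-toy-x1")]
rapex_entries = ["brandy-charger-z9"]  # hypothetical alert-derived key
active = [
    Listing("new-7", "acme-toy-x1"),    # reappearing removed product
    Listing("new-8", "safe-kettle-k2"),
]
for hit in daily_scan(active, takedown_db, rapex_entries):
    print(f"remove expeditiously: {hit.listing_id}")
```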
Amendment 1079 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where the explanation of the reasons as referred to in paragraph 2(a) does not allow a diligent economic operator to identify the illegality of the content in question; where the notified content is not illegal in the country of establishment of the hosting service; or where there is a genuine demonstrable doubt about the illegality of the content, the hosting service may seek assistance for further clarification from the relevant authority or the national Digital Services Coordinator.
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon the receipt of a notice or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 1152 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
Amendment 1159 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
Amendment 1163 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
Amendment 1205 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 1243 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
Amendment 1263 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, having regard to their expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
Amendment 1276 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it is an individual rightholder or represents collective interests and is independent from any online platform;
Amendment 1324 #
Proposal for a regulation
Article 20 – paragraph 1
1. After having issued a prior warning, online platforms may suspend or terminate the provision of their services to recipients of the service that repeatedly provide manifestly illegal content.
Amendment 1350 #
Proposal for a regulation
Article 20 a (new)
Article 20a
Content of public interest
1. When an online platform takes the decision to remove content or to suspend the provision of its services to a recipient of the service, it shall take into account whether the content is or appears to be specifically intended to contribute to public policy objectives, in particular where the content is of particular importance to public policy, public security or public health objectives at Union or national level.
2. If an online platform decides to remove content, or to suspend the provision of its services to a user, where that content is or appears to be of public interest related to public policy, public security or public health, the online platform shall take the necessary technical and organisational measures to ensure that complaints through the internal complaint-handling system referred to in Article 17 are processed and decided upon with priority and without delay.
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
Amendment 1398 #
Proposal for a regulation
Article 22 – paragraph 1 – point f a (new)
(fa) whether the drop shipping principle is applied, i.e. goods are offered that are not in stock in the retailer's warehouse;
Amendment 1411 #
Proposal for a regulation
Article 22 – paragraph 2 a (new)
Amendment 1442 #
Proposal for a regulation
Article 22 – paragraph 6
6. The online platform shall make the information referred to in points (a), (d), (e), (f) and (g) of paragraph 1 available to the recipients of the service, accessible in accordance with Directive (EU) 2019/882, in a clear and comprehensible manner.
Amendment 1447 #
Proposal for a regulation
Article 22 – paragraph 6 a (new)
6a. In order to comply with paragraph 1 point (g), web shops shall indicate, next to the depicted goods, whether the goods are in stock or whether a manufacturer first has to be found for them. Online marketplaces shall provide third-party sellers with a dropshipping labelling tool, which they must use in order to be approved by the platform.
Amendment 1452 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
7a. The online platform may rely on the information provided by third-party suppliers referred to in Article 6a point (b) of Directive (EU) 2019/2161, unless the platform knows or ought to know, based on the available data regarding transactions on the platform, that this information is incorrect. Online platforms must take adequate measures to prevent traders from appearing on the platform as non-traders.
Amendment 1454 #
Proposal for a regulation
Article 22 – paragraph 7 b (new)
7b. An online platform is liable for damages caused to consumers by a violation of its duties under this Article.
Amendment 1455 #
Proposal for a regulation
Article 22 – paragraph 7 c (new)
7c. The online platform must inform the consumer, at the earliest possible point in time and immediately before the distance contract is concluded with a third-party provider, in a prominent manner, that the consumer is concluding a contract with the third party and not with the online platform. If the online platform violates this duty to inform, the consumer may also assert against the online platform the rights and legal remedies for non-performance arising from the distance contract with the third party.
Amendment 1456 #
Proposal for a regulation
Article 22 – paragraph 7 d (new)
7d. If an online platform provides misleading information about third-party providers, about goods, services or digital content offered by third-party providers, or about other provisions of the distance contract, the online platform shall be liable for the damage that this misleading information inflicts on consumers.
Amendment 1457 #
Proposal for a regulation
Article 22 – paragraph 7 e (new)
7e. An online platform shall be liable for guarantees it gives about a third-party supplier or about goods, services or digital content offered by a third-party supplier.
Amendment 1463 #
Proposal for a regulation
Article 22 a (new)
Article 22a
Duty to protect recipients of the service
Operators of online platforms allowing consumers to conclude distance contracts with traders or consumers, or of very large online platforms according to Article 25, who fail to take adequate measures for the protection of the recipients of the service upon obtaining credible evidence of criminal conduct of a recipient of the service to the detriment of other recipients, or evidence of the illegality of a certain product, service, commercial practice or advertising method of a third-party supplier, shall be held liable for the damages resulting from such a failure.
Amendment 1507 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms or advertising service providers that deliver advertisements shall also verify the accuracy of the information about the advertiser in accordance with the due diligence obligations pursuant to Article 22. Where there are indications of dubious offers, such as manifestly suspicious content, user reports or web shops "blacklisted" on warning lists, platforms or the advertising service providers behind them shall not display the advertising.
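A minimal sketch, under assumed inputs, of the gatekeeping step this paragraph describes: refusing ad delivery when the advertiser appears on a warning list or user reports accumulate. The domains and the report threshold are placeholders, not values taken from the amendment.

```python
# Placeholder warning list; a real system would consult the external
# "blacklists" the amendment refers to.
BLACKLISTED_SHOPS = {"scam-shop.example", "fake-deals.example"}

def may_display_ad(advertiser_domain: str,
                   user_reports: int,
                   report_threshold: int = 10) -> bool:
    """Return False when indications of a dubious offer exist."""
    if advertiser_domain in BLACKLISTED_SHOPS:
        return False
    return user_reports < report_threshold
```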
Amendment 1519 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Prevention measures against online fraud on platforms
Member States shall promote preventive measures to reduce consumer harm caused by illegal advertising and sales practices on platforms. This includes, among other things, the establishment of information platforms that publish daily warnings about current online traps. Such initiatives shall be linked Union-wide via a network, financed by the Commission and supported by an EU coordinator. Hosting providers shall provide clearly visible links to these prevention pages.
Amendment 1520 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Right to information
1. Where an online platform becomes aware, irrespective of the means used to gain such awareness, of the illegal nature of a product or service offered through its services, it shall inform those recipients of the service that had acquired such product or contracted such service during the last six months about the illegality, the identity of the trader and any means of redress.
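As an illustration of the six-month lookback, a Python sketch selecting the recipients to be informed; the Purchase structure and the 183-day approximation of six months are assumptions, not part of the amendment.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Purchase:
    buyer_email: str
    product_id: str
    purchased_at: datetime

def recipients_to_inform(purchases: list[Purchase],
                         illegal_product_id: str,
                         now: datetime) -> set[str]:
    """All buyers of the illegal product within the last six months
    (approximated here as 183 days)."""
    cutoff = now - timedelta(days=183)
    return {p.buyer_email for p in purchases
            if p.product_id == illegal_product_id
            and p.purchased_at >= cutoff}
```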
Amendment 1570 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 1697 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters used in recommender systems shall always be fair and non-discriminatory.
Amendment 1708 #
Proposal for a regulation
Article 29 a (new)
Article 29a
Recommendation systems and individual or target-group-specific pricing on online marketplaces
The description shall also include information on whether users are shown different prices depending on individual factors, as defined in Article 6(1), point (ea), of Directive 2011/83/EU, or target-group-specific factors, in particular the devices used and geographical location. Where applicable, the platform shall make reference to these factors in a clearly visible manner.
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, functional and reliable tools accessible through application programming interfaces, a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
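To make the repository requirements concrete, a hedged in-memory sketch: AdRecord, the query fields and the five-year purge below are illustrative stand-ins for what would in practice be a database behind an application programming interface.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class AdRecord:
    advertiser: str
    ad_text: str
    target_audience: str
    amount_spent_eur: float
    last_displayed: date   # no personal data of recipients is stored

class AdRepository:
    def __init__(self) -> None:
        self._ads: list[AdRecord] = []

    def add(self, ad: AdRecord) -> None:
        self._ads.append(ad)

    def query(self, **criteria) -> list[AdRecord]:
        """Multi-criterion filter over any combination of AdRecord fields."""
        return [ad for ad in self._ads
                if all(getattr(ad, k) == v for k, v in criteria.items())]

    def total_spend(self, **criteria) -> float:
        """Aggregated amount spent for a query, as paragraph 1 requires."""
        return sum(ad.amount_spent_eur for ad in self.query(**criteria))

    def purge_expired(self, today: date) -> None:
        """Drop records older than the five-year retention period
        (approximated here as 5 * 365 days)."""
        cutoff = today - timedelta(days=5 * 365)
        self._ads = [ad for ad in self._ads if ad.last_displayed >= cutoff]
```

A caller could then ask, for example, repo.total_spend(advertiser="ACME") for the aggregated spend of one advertiser.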
Amendment 1737 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The repository must be easily accessible for users and contain a complaint and reporting option for users, addressed directly to the platform and the responsible advertising service provider. The requirements for notices under Article 14 shall also apply to notices and complaints about advertising content.
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Algorithm accountability
1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used.
2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements:
(a) the compliance with corresponding Union requirements;
(b) how the algorithm is used and its impact on the provision of the service;
(c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and
(d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c).
3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations.
4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, where the additional information on the findings provided is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission.
5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure the algorithm complies with the criteria set out in paragraph 2.
6. Where the Commission finds, on the basis of the information provided by the very large online platform, that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, and that the very large online platform has not undertaken the corrective measures referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
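Paragraphs 2 and 5 map naturally onto a small data structure; the sketch below encodes points (a) to (d) and the trigger for corrective measures, with all field names invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class AlgorithmAssessment:
    """One assessed automated decision-making system; fields mirror
    points (a) to (d) of paragraph 2."""
    algorithm_id: str
    complies_with_union_law: bool       # point (a)
    usage_and_service_impact: str       # point (b)
    fundamental_rights_impact: str      # point (c)
    resilience_measures_adequate: bool  # point (d)

def corrective_measures_required(a: AlgorithmAssessment) -> bool:
    """Paragraph 5: corrective measures are triggered by non-compliance
    with point (a) or point (d)."""
    return not (a.complies_with_union_law
                and a.resilience_measures_adequate)
```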
Amendment 1809 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Interoperability
1. Very large online platforms shall provide, by creating and offering an application programming interface, options enabling the interoperability of their core services with other online platforms.
2. Application programming interfaces should be easy to use, while the processing of personal data shall only be possible in a manner that ensures appropriate security of those data. Measures under paragraph 1 may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue delay in the provision of interoperability.
3. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
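What "an application programming interface enabling interoperability of core services" might minimally look like is sketched below; the three methods are purely hypothetical examples of core-service functions, not an interface mandated by the amendment.

```python
from abc import ABC, abstractmethod

class CoreServiceAPI(ABC):
    """Abstract API surface a very large online platform could expose
    so that other platforms can interoperate (paragraph 1)."""

    @abstractmethod
    def fetch_public_posts(self, account_id: str, limit: int) -> list[dict]:
        """Read access to an account's public content."""

    @abstractmethod
    def deliver_message(self, sender: str, recipient: str, body: str) -> bool:
        """Accept a message from another platform; True on acceptance."""

    @abstractmethod
    def maintenance_window(self) -> str:
        """Announce security-fix downtime so that fixes (paragraph 2)
        do not silently break interoperating services."""
```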
Amendment 1811 #
Proposal for a regulation
Article 34 – paragraph 1 – introductory part
1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies and, whenever available, widely used information and communication technology standards that meet the requirements set out in Annex II to Regulation (EU) No 1025/2012, at least for the following:
Amendment 1822 #
Proposal for a regulation
Article 34 – paragraph 1 – point e
(e) interoperability of the advertisement repositories referred to in Article 30(2), and the APIs referred to in Article 33a;
Amendment 1842 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. The absence of such standards as defined in this Article should not prevent the timely implementation of the measures outlined in this Regulation.
Amendment 1865 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives in relation to the dissemination of illegal content, and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
Amendment 1914 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2
Member States shall make publicly available through online and offline means, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted.
Amendment 1941 #
Proposal for a regulation
Article 41 – paragraph 1 – introductory part
1. Where needed for carrying out their tasks under this Regulation and also in order to avoid any discrepancy in the enforcement of the Digital Services Act, Digital Services Coordinators shall have at least the following powers of investigation, in respect of conduct by providers of intermediary services under the jurisdiction of their Member State:
Amendment 2274 #
Proposal for a regulation
Article 67 – paragraph 1
1. The Commission shall establish and maintain a reliable and secure information sharing system supporting communications between Digital Services Coordinators, the Commission and the Board based on the Internal Market Information system.