40 Amendments of Alessandra MORETTI related to 2020/0361(COD)
Amendment 31 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination based on any grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation.
Amendment 32 #
Proposal for a regulation
Recital 3 a (new)
(3 a) Gender equality is one of the founding values of the European Union (Articles 2 and 3(3) of the Treaty on European Union (TEU)). This objective is also enshrined in Article 21 of the Charter of Fundamental Rights. Article 8 TFEU gives the Union the task of eliminating inequalities and promoting equality between genders in all of its activities and policies. In order to protect women’s rights and tackle gender-based online violence, the principle of gender mainstreaming should be applied in all European policies, including those regulating the functioning of the internal market and its digital services.
Amendment 39 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, sexual exploitation and abuse of women and girls, revenge porn, online stalking, online sexual harassment, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 42 #
Proposal for a regulation
Recital 12 a (new)
(12 a) Online violence is a phenomenon which needs to be addressed for the safety of all users. In particular, special attention should be paid to tackling online gender-based violence against women. A 2014 survey by the European Union Agency for Fundamental Rights, the most comprehensive at EU level in this field, has shown that 1 in 10 women in the EU aged 15 or over has faced online harassment. According to the Declaration on the Elimination of Violence against Women (A/RES/48/104), violence against women is "any act of gender-based violence that results in, or is likely to result in, physical, sexual or psychological harm or suffering to women, including threats of such acts, coercion or arbitrary deprivation of liberty, whether occurring in public or in private life". Online violence against women should, therefore, be understood as any act of gender-based violence against women that is committed, assisted or aggravated in part or fully by the use of ICT, such as mobile phones and smartphones, the Internet, social media platforms or email, against a woman because she is a woman, or that affects women disproportionately.
Amendment 45 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content taking into account the potential harm the illegal content in question may create. In order to ensure a harmonised implementation of illegal content removal throughout the Union, the provider should, within 24 hours, remove or disable access to illegal content that can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety. Examples include the sharing of images depicting child sexual abuse, content related to sexual exploitation and abuse of women and girls, and revenge porn. According to the well-established case-law of the Court of Justice and in line with Directive 2000/31/EC, the concept of ‘public policy’ involves a genuine, present and sufficiently serious threat which affects one of the fundamental interests of society, in particular for the prevention, investigation, detection and prosecution of criminal offences, including the protection of minors and the fight against any incitement to hatred on grounds of race, gender, religion or nationality, and violations of human dignity concerning individual persons. The concept of ‘public security’ as interpreted by the Court of Justice covers both the internal security of a Member State, which may be affected by, inter alia, a direct threat to the physical security of the population of the Member State concerned, and the external security, which may be affected by, inter alia, the risk of a serious disturbance to the foreign relations of that Member State or to the peaceful coexistence of nations. Where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety, the provider should remove or disable access to illegal content within seven days. The deadlines referred to in this Regulation should be without prejudice to specific deadlines set out in Union law or within administrative or judicial orders. The provider may derogate from the deadlines referred to in this Regulation on the grounds of force majeure or for justifiable technical or operational reasons but it should be required to inform the competent authorities as provided for in this Regulation. The removal or disabling of access should be undertaken in the observance of the principles of the Charter of Fundamental Rights, including a high level of consumer protection and freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 46 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent and non-discriminatory manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
Amendment 51 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, to ensure a safe and transparent online environment and to ensure the right to non-discrimination, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
Amendment 54 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Data should be reported as disaggregated as possible. For example, anonymised individual characteristics such as gender, age group and social background of the notifying parties should be reported, whenever available. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40
_________________
40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
Amendment 55 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Online platforms in particular may also allow users or trusted flaggers to notify content, including their own, to which others are responding with illegal content at large, such as illegal hate speech. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 60 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material, organisations committed to notifying illegal racist and xenophobic expressions online and women’s rights organizations such as the European Women’s Lobby. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43
_________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 62 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising that can have both an impact on the equal treatment and opportunities of citizens and on the perpetuation of harmful gender stereotypes and norms. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 67 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, content related to the sexual exploitation and abuse of women and girls, revenge porn, online stalking or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through advertising, recommender systems or through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to personal data protection, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, which might amplify discriminatory speech and content, or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 70 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently cease, prevent and mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions to cover aspects such as online gender-based violence. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. They may also consider providing training to their staff and specifically to content moderators so they can stay up to date on covert language used as a form of illegal hate speech and violence against certain groups such as women and minorities. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers and civil society organizations that represent the interests of certain groups such as women’s rights organisations, organise training sessions and exchanges with these organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 75 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services such as consumers’ and women’s rights organisations, independent experts and civil society organisations.
Amendment 79 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. This data should be provided as disaggregated as possible in order to allow for meaningful conclusions to be drawn from it. For example, it is important that very large online platforms provide gender-disaggregated data as much as possible in order for vetted researchers to have the possibility to explore whether and in what way certain online risks are experienced differently between men and women. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 80 #
Proposal for a regulation
Recital 74
(74) The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. Moreover, it is important to ensure that the Digital Services Coordinator, as well as other competent authorities, have the necessary knowledge to guarantee the rights and obligations of this Regulation. Therefore, they should promote education and training on fundamental rights and discrimination for their staff, including training in partnership with law enforcement authorities, crisis management authorities or civil society organisations that support victims of illegal online and offline activities such as harassment, gender-based violence and illegal hate speech.
Amendment 81 #
Proposal for a regulation
Recital 82
(82) Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Services Coordinators’ request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, content associated with the sexual exploitation and abuse of women and girls and revenge porn, or to disable access to services that are being used by a third party to infringe an intellectual property right, are not reasonably available.
Amendment 85 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities such as the European Data Protection Supervisor and the European Union Agency for Fundamental Rights under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
Amendment 94 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
Amendment 95 #
Proposal for a regulation
Article 5 – paragraph 1 a (new)
1 a. Without prejudice to specific deadlines set out in Union law or in administrative or legal orders, providers of hosting services shall, upon obtaining actual knowledge or awareness, remove or disable access to illegal content as soon as possible and in any event:
(a) within 24 hours where the illegal content can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety;
(b) within seven days where the illegal content cannot seriously harm public policy, public security, public health or consumers’ health or safety.
Where the provider of hosting services cannot comply with the obligation in paragraph 1a on grounds of force majeure or for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the competent authority that has issued an order pursuant to Article 8 or the recipient of the service that has submitted a notice pursuant to Article 14, of those grounds.
Amendment 99 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format, in a searchable archive of all the previous versions with their date of application.
Amendment 108 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, anonymised data on individual characteristics of those who submit these notices such as gender, age group and social background, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
Amendment 109 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, anonymised data on individual characteristics of those who submit these complaints, such as gender, age group and social background, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 113 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Recipients’ consent for advertising practices
1. Providers of intermediary services shall, by default, not make the recipients of their services subject to targeted, micro-targeted and behavioural advertisement unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information, including information about the value of giving access to and about the use of their data, including an explicit mention in cases where data are collected with recourse to trackers and whether these data are collected on websites of other providers.
2. When asking for the consent of recipients of their services considered as vulnerable consumers, providers of intermediary services shall implement all the necessary measures to ensure that such consumers have received sufficient and relevant information before they give their consent.
3. When processing data for targeted, micro-targeted and behavioural advertising, online intermediaries shall comply with relevant Union law and shall not engage in activities that can lead to pervasive tracking, such as disproportionate combination of data collected by platforms, or disproportionate processing of special categories of data that might be used to exploit vulnerabilities.
4. Providers of intermediary services shall organise their online interface in such a way that recipients of services, in particular those considered as vulnerable consumers, can easily and efficiently access and modify advertising parameters. Providers of intermediary services shall monitor the use of advertising parameters by recipients of services on a regular basis and make best efforts to improve their awareness about the possibility to modify those parameters.
Amendment 116 #
Proposal for a regulation
Article 14 – paragraph 2 – point d a (new)
(d a) the option for those submitting notices to outline some of their individual characteristics, such as gender, age group and social background; the providers shall make clear that this information shall not be part of the decision-making process with regard to the notice, shall be completely anonymised and used solely for reporting purposes.
Amendment 123 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(c a) decisions whether or not to restrict the ability to monetize content provided by the recipients.
Amendment 125 #
Proposal for a regulation
Article 17 – paragraph 4 a (new)
4 a. Online platforms shall give the option for those submitting complaints to outline some of their individual characteristics, such as gender, age group and social background. Online platforms shall make clear that this information shall not be part of the decision-making process with regard to the complaint, shall be completely anonymised and used solely for reporting purposes.
Amendment 126 #
Proposal for a regulation
Article 24 a (new)
Amendment 134 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on online violence, the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 137 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content or content associated with online violence and gender-based violence and of information that is incompatible with their terms and conditions.
Amendment 143 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to cease, prevent and mitigate systemic risks, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
Amendment 150 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to cease, prevent and mitigate the systemic risks identified.
Amendment 152 #
Proposal for a regulation
Article 29
Amendment 153 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable through easy-to-access, functional and reliable tools through application programming interfaces a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 154 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1) and to verify the effectiveness of the risk mitigation measures taken by the very large online platform in question under Article 27.
Amendment 155 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
3 a. The data provided to vetted researchers should be as disaggregated as possible, unless the researcher requests otherwise.
Amendment 156 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions or civil society organisations representing the public interest, be independent from commercial interests, disclose the funding financing the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
Amendment 157 #
Proposal for a regulation
Article 33 a (new)
Amendment 159 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission and the Board may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, trusted flaggers and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. Trusted flaggers and vetted researchers may submit to the Commission and the Board requests for codes of conduct to be considered based on the systemic risk reports referred to in Article 13 and research evaluating the impact of the measures put in place by online platforms to address these risks.
Amendment 162 #
Proposal for a regulation
Article 36 a (new)