Activities of Marcel KOLAJA related to 2020/0361(COD)
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (208)
Amendment 127 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of, rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – the proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not, or not fully, addressed by those other acts, as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. At the same time, it respects Member States' competence to adopt and further develop laws in order to protect and promote the freedom of expression in line with the Charter of Fundamental Rights of the European Union. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
Amendment 135 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of national laws implemented in line with Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.
Amendment 141 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, and what the precise nature or subject matter is of the law in question. For the purpose of this Regulation, the concept of “manifestly illegal content” should be defined as any information which has been the subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State.
Amendment 149 #
Proposal for a regulation
Recital 15 a (new)
(15 a) Ensuring that providers of intermediary services can offer effective end-to-end encryption of data is essential for trust in and the security of digital services in the Digital Single Market, and effectively prevents unauthorised third-party access.
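As a purely illustrative, non-legislative sketch of what recital 15 a protects: with end-to-end encryption, content is encrypted on the sender's device and can only be decrypted by the intended recipient, so the intermediary that relays it never sees the plaintext. The Python toy below is hypothetical throughout; a hash-based keystream stands in for the real cipher and key exchange an actual messenger would use.

```python
import hashlib
import os
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream from repeated SHA-256 (illustration only, not a real cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes):
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key = secrets.token_bytes(32)            # known only to the two endpoints
nonce, ct = encrypt(key, b"meet at noon")
# The relaying service sees only (nonce, ct); without `key` it cannot
# recover the plaintext -- only the receiving endpoint can.
assert decrypt(key, nonce, ct) == b"meet at noon"
```

The point of the sketch is the trust boundary, not the cryptography: the intermediary handles only ciphertext, which is why offering such services should not become a cause for liability or loss of immunity.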
Amendment 160 #
Proposal for a regulation
Recital 25
Amendment 165 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national judicial authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Providers of intermediary services should not be obliged to use automated tools for content moderation. Nothing in this Regulation should be construed as preventing providers of intermediary services from offering end-to-end encrypted services, or as making the provision of such services a cause for liability or loss of immunity.
Amendment 169 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
Amendment 171 #
Proposal for a regulation
Recital 32
(32) The orders by judicial authorities to provide information about one or more specific suspects of a serious threat to public security, or orders by national authorities about a specific item of information such as service providers' licence numbers, address of rental or number of nights let, regulated by this Regulation, concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders as suspects of a serious threat to public security, for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
Amendment 172 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established, and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders to provide information relate to specific items of information, where they are addressed to providers of intermediary services established in another Member State they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
Amendment 173 #
Proposal for a regulation
Recital 33 a (new)
(33 a) With the exception of Article 7, the rules in Chapter II should not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service, including free and open source software projects, online encyclopaedias and educational or scientific repositories. Content managed by educational and scientific repositories is subject to various national laws, is made available in order to safeguard the public interest, and is meant to be re-used by students, researchers and the general public.
Amendment 178 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of the fundamental rights of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes. Terms and conditions of providers of intermediary services should respect the essential principles of human rights as enshrined in the Charter and international law, including the right to freedom of expression. The freedom and pluralism of the media should be respected; to this end, Member States should ensure that editorial content providers' and media service providers' possibilities to contest decisions of online platforms or to seek judicial redress in accordance with the laws of the Member State concerned are unaffected.
Amendment 184 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC,40 or that are a not-for-profit service with fewer than 100 000 monthly active users. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
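To illustrate what a "standardised and machine-readable" transparency report could look like in practice, a minimal JSON serialisation is sketched below. All field names are hypothetical: the recital mandates the format requirement but fixes no concrete schema.

```python
import json

# Hypothetical schema for an annual content-moderation transparency report;
# the Regulation requires machine-readability, not these particular fields.
report = {
    "provider": "example-hosting.eu",
    "period": "2023",
    "notices_received": 1520,
    "items_removed": {"illegal_content": 340, "terms_violations": 610},
    "automated_moderation_share": 0.42,   # share of decisions taken by automated tools
    "median_handling_time_hours": 18,
}

serialized = json.dumps(report, indent=2, sort_keys=True)
parsed = json.loads(serialized)  # any regulator or researcher can parse it back
assert parsed["notices_received"] == 1520
```

A machine-readable format matters precisely because it lets Digital Services Coordinators and researchers aggregate and compare reports across providers without manual extraction.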
Amendment 186 #
Proposal for a regulation
Recital 39 a (new)
(39 a) Recipients of the service should be empowered to make autonomous decisions inter alia regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. Dark patterns, however, typically exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from subverting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark pattern’). This includes, but is not limited to, exploitative design choices to direct the user to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, repetitively requesting or pressuring the recipient to make a decision, or hiding or obscuring certain options.
Amendment 188 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Content that has been notified and that is not manifestly illegal should remain accessible while the assessment of its legality by the judicial authority is still pending. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Recipients of the service who provided the information to which the notice relates should be given the opportunity to reply before a decision is taken. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as providers of hosting services covered by this Regulation.
Amendment 189 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property and the right to non-discrimination of parties affected by illegal content.
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of the information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to creating financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and participation at multiple levels of the advertising value chain has created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients. 
Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
Amendment 193 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. The recourse available to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 194 #
Proposal for a regulation
Recital 42 a (new)
(42 a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in the overblocking of legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood as making publishing subject to an automated decision. Filtering automated content submissions, such as spam, should be permitted. Where automated tools are otherwise used for content moderation, the provider should ensure human review and the protection of legal content.
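A hypothetical sketch of the distinction the recital draws: automated filtering of automated submissions (spam) may gate publication ex ante, while any other automated signal must not block publication and instead only queues the item for human review. All names and heuristics below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    publish: bool
    needs_human_review: bool

def looks_like_spam(content: str) -> bool:
    # Invented heuristic standing in for a real spam filter.
    return content.count("http://") > 5

def automated_classifier_flags(content: str) -> bool:
    # Invented stand-in for an automated content classifier.
    return "suspect-term" in content

def moderate(content: str) -> Decision:
    # Permitted ex-ante automation: filtering automated submissions (spam).
    if looks_like_spam(content):
        return Decision(publish=False, needs_human_review=False)
    # Any other automated signal must NOT gate publication; it only
    # queues the item for human review after it goes live.
    flagged = automated_classifier_flags(content)
    return Decision(publish=True, needs_human_review=flagged)
```

Note that `moderate` can return `publish=True` together with `needs_human_review=True`: flagged content stays up pending review, which is the opposite of an upload filter.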
Amendment 197 #
Proposal for a regulation
Recital 44
(44) Recipients of the service, including persons with disabilities, should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
Amendment 202 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. 
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
Amendment 206 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by repeatedly submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress against the decisions taken in this regard by online platforms should always be open, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 211 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
Amendment 218 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the impact of the functioning and use of their service, as well as of potential misuse by the recipients of the service, on fundamental rights, and take appropriate mitigating measures.
Amendment 219 #
Proposal for a regulation
Recital 57
(57) Three categories of adverse impact should be assessed in depth. A first category concerns the risks associated with the misuse of their service through the dissemination of manifestly illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, including freedom and pluralism of media, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the malfunctioning or intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated exploitation of the service, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with fundamental rights.
Amendment 226 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the adverse impacts identified in the fundamental rights impact assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of adverse impacts. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of the adverse impact identified on the fundamental rights of the recipients of the service.
Amendment 231 #
(59) Very large online platforms should, where appropriate, conduct their impact assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
Amendment 232 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the Fundamental Rights Agency and the auditor access to all relevant data necessary to perform the audit properly. The Fundamental Rights Agency and the auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. The Fundamental Rights Agency and the auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets in line with Directive (EU) 2016/943, that they obtain when performing their tasks, and have the necessary expertise in the area of risk management and the technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
Amendment 234 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. In addition, very large online platforms should offer the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties should be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
Amendment 239 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems by providing relevant source code and associated data that allow the detection of possible biases or threats to fundamental rights for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations of possible biases or threats to fundamental rights are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, in line with Directive (EU) 2016/943 and the privacy of any other parties concerned, including the recipients of the service.
Amendment 240 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Given the cross-border nature of the services at stake, EU action to harmonise accessibility requirements for very large online platforms across the internal market is necessary to avoid market fragmentation and to ensure that an equal right of access to, and choice of, those services is guaranteed for persons with disabilities. A lack of harmonised accessibility requirements for digital services can create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for very large online platforms, including their user interfaces, must be consistent with existing Union accessibility legislation, including Directive (EU) 2019/882 and Directive (EU) 2016/2102.
Amendment 242 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board may facilitate the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up of, and adhere to, specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
Amendment 243 #
Amendment 245 #
Proposal for a regulation
Recital 70
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct may support and complement the transparency obligations relating to advertising for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.
Amendment 246 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing-up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation, or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms may be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. The Commission should also ensure that measures are in place to ensure accessibility for persons with disabilities during the implementation of crisis protocols.
Amendment 247 #
Proposal for a regulation
Recital 71 a (new)
(71 a) Before initiating or facilitating the negotiation or the revision of codes of conduct, the Commission should consider the appropriateness of proposing legislation and invite the European Parliament, the Council, the Fundamental Rights Agency, the public and, where relevant, the European Data Protection Supervisor to express their opinions, and should publish those opinions. It should also conduct a Fundamental Rights Impact Assessment and publish the findings.
Amendment 261 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5 a. This Regulation is without prejudice to national law regulating the protection and/or promotion of cultural diversity and the plurality of the media in conformity with Union law and the Charter of Fundamental Rights of the European Union.
Amendment 265 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
(g a) ‘manifestly illegal content’ means any information which has been the subject of a specific ruling by a court or administrative authority of a Member State, or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
Amendment 272 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘persons with disabilities’ means persons within the meaning of Article 3 (1) of Directive (EU) 2019/882;
Amendment 274 #
Proposal for a regulation
Article 6
Amendment 277 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services shall not be obliged to use automated tools for content moderation.
Amendment 278 #
Proposal for a regulation
Article 7 – paragraph 1 b (new)
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity.
Amendment 282 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 284 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) the territorial scope of an order addressed to a provider that has its main establishment or, if the provider is not established in the Union, its legal representation in another Member State is limited to the territory of the Member State issuing the order;
Amendment 285 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(b b) if addressed to a provider that has its main establishment outside the Union, the territorial scope of the order, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
Amendment 287 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt via a secure communication channel of an order to provide a specific item of information about a suspect or suspects of a serious threat to public security, issued by the relevant national judicial authority on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
Amendment 289 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
1 a. Providers of intermediary services shall, upon receipt via a secure communication channel of an order, provide national authorities, where proportionate and strictly necessary for the enforcement of existing national, regional or local regulation, with a specific item of information about service providers’ licence numbers, the address of a rental, the number of nights let on the platform or the number of services provided, in compliance with Regulation (EU) 2016/679.
Amendment 290 #
Proposal for a regulation
Article 9 – paragraph 2 – introductory part
2. Member States shall ensure that orders referred to in paragraph 1 seek information about a suspect or suspects of serious crime and meet the following conditions:
Amendment 292 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a detailed statement of reasons explaining the legal basis and the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, taking due account of the impact of the measures on fundamental rights;
Amendment 293 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
- a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
Amendment 295 #
Proposal for a regulation
Article 9 a (new)
Article 9 a With the exception of Article 7, this Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100 000 monthly active users.
Amendment 298 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of information provided by the recipients of the service in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, user-friendly and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the language in which the service is offered.
Amendment 300 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available.
Amendment 301 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a coherent, predictable, non-discriminatory, transparent, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, in compliance with procedural safeguards and with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter and relevant national law.
Amendment 306 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Any restriction referred to in paragraph 1 must respect fundamental rights enshrined in the Charter and relevant national law.
Amendment 308 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2 b. Individuals who enforce restrictions on the basis of the terms and conditions of providers of intermediary services should be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant.
Amendment 309 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2 c. Terms and conditions of providers of intermediary services shall respect the essential principles of human rights as enshrined in the Charter and international law.
Amendment 310 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2 d. Member States shall ensure that editorial content providers’ and media service providers’ possibilities to contest decisions of online platforms or to seek judicial redress in accordance with the laws of the Member State concerned are unaffected.
Amendment 311 #
Proposal for a regulation
Article 12 – paragraph 2 e (new)
2 e. Terms that do not comply with this Article shall not be binding on recipients.
Amendment 320 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information, such as keywords, the language setting communicated by the device of the recipient or the digital location where the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or a clearly identifiable group of recipients/persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
Amendment 328 #
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
(a a) evidence that substantiates the claim, where possible;
Amendment 329 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the exact electronic location of that information, such as the exact URL or URLs or other identifiers where appropriate, and, where necessary, additional information enabling the identification of the alleged illegal content;
Amendment 333 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Amendment 334 #
Proposal for a regulation
Article 14 – paragraph 2 – point c a (new)
(c a) where the information concerns an alleged infringement of an intellectual property right, evidence that the entity submitting the notice is the rightholder or authorised to act on behalf of the rightholder;
Amendment 335 #
Proposal for a regulation
Article 14 – paragraph 3
Amendment 337 #
Proposal for a regulation
Article 14 – paragraph 4
4. The individual or entity that submitted the notice shall be given the option to provide an electronic mail address to enable the provider of hosting services to promptly send a confirmation of receipt of the notice to that individual or entity. Where individuals decide to include their contact details in a notice, their anonymity towards the recipient of the service who provided the content shall be ensured, except in cases of alleged violations of personality rights or of intellectual property rights.
Amendment 339 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that action.
Amendment 340 #
Proposal for a regulation
Article 14 – paragraph 5 a (new)
5 a. The provider of intermediary services shall also notify the recipient of the service who provided the information, where contact details are available, giving them the opportunity to reply before taking a decision, unless this would obstruct the prevention and prosecution of serious criminal offences.
Amendment 344 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
Amendment 345 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6 b. Information that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality by the competent authority is still pending. Member States shall ensure that providers of intermediary services are not held liable for failure to remove notified information, while the assessment of legality is still pending.
Amendment 346 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6 c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.
Amendment 347 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
6 d. The provider of hosting services shall ensure that the processing of notices is undertaken by qualified individuals, to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
Amendment 351 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are provided to staff, and that, where necessary, they are given the opportunity to seek professional support, qualified psychological assistance and qualified legal advice. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 353 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, demote, disable access to or impose other sanctions against the information;
Amendment 357 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly and non-discriminatory, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities.
Amendment 358 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, non-discriminatory and non-arbitrary manner and within seven days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 359 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not solely taken on the basis of automated means and are reviewed by qualified staff, to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
Amendment 362 #
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
2. The Digital Services Coordinator of the Member State where the independent out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum of three years, which can be renewed, where the body has demonstrated that it meets all of the following conditions:
Amendment 363 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms, recipients of the service provided by the online platforms and any third party involved in the dispute, and its members are remunerated in a way that is not linked to the outcome of the procedure;
Amendment 364 #
Proposal for a regulation
Article 18 – paragraph 2 – point a a (new)
(a a) it is composed of legal experts;
Amendment 365 #
Proposal for a regulation
Article 18 – paragraph 2 – point b
(b) it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute, as well as a general understanding of law;
Amendment 366 #
Proposal for a regulation
Article 18 – paragraph 2 – point b a (new)
(b a) the natural persons with responsibility for dispute settlement are granted a period of office of a minimum of three years to ensure the independence of their actions;
Amendment 367 #
Proposal for a regulation
Article 18 – paragraph 2 – point b b (new)
(b b) the natural persons with responsibility for dispute settlement commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended;
Amendment 368 #
Proposal for a regulation
Article 18 – paragraph 2 – point b c (new)
(b c) natural persons with responsibility for dispute resolution may not have worked for an online platform or a professional organisation or business association of which the online platform is a member for a period of two years before taking up their position in the body;
Amendment 369 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
Amendment 371 #
Proposal for a regulation
Article 18 – paragraph 2 – point c a (new)
(c a) the anonymity of the individuals involved in the settlement procedure can be guaranteed;
Amendment 372 #
Proposal for a regulation
Article 18 – paragraph 2 – point d
(d) it ensures the settling of a dispute in a swift, efficient and cost-effective manner and in at least one official language of the Union or, at the request of the recipient, at least in English;
Amendment 373 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible;
Amendment 374 #
Proposal for a regulation
Article 18 – paragraph 2 – point e a (new)
(e a) it ensures that a preliminary decision is taken within a period of seven days following receipt of the complaint and that the outcome of the dispute settlement is made available within a period of 90 calendar days from the date on which the body received the complete complaint file.
Amendment 383 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 385 #
Proposal for a regulation
Article 20 – paragraph 1
Amendment 388 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaint-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints or initiate dispute settlements that are manifestly unfounded.
Amendment 391 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
Amendment 393 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
Article 20 – paragraph 3 – point c
(c) the gravity of the misuses and its consequences, in particular on the exercise of fundamental rights, regardless of the absolute numbers or relative proportion;
Amendment 394 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
Article 20 – paragraph 3 – point d a (new)
(d a) the fact that notices and complaints were submitted following the use of an automated content recognition system;
Amendment 395 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
Article 20 – paragraph 3 – point d b (new)
(d b) any justification provided by the recipient of the service to provide sufficient grounds to consider that the information is not manifestly illegal.
Amendment 396 #
Proposal for a regulation
Article 20 – paragraph 4
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, with due regard to their obligations under Article 12(2) in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 397 #
Proposal for a regulation
Recital 42 a (new)
Recital 42 a (new)
(42a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in over-blocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood as making publishing subject to an automated decision. Filtering automated content submissions such as spam should be permitted. Where automated tools are otherwise used for content moderation the provider should ensure human review and the protection of legal content.
Amendment 399 #
Proposal for a regulation
Article 22 – paragraph 1 – point a
Article 22 – paragraph 1 – point a
(a) the name, address (provided that the trader is not a self-employed or independent professional whose address is his or her private address), telephone number and electronic mail address of the trader;
Amendment 410 #
Proposal for a regulation
Chapter III – Section 4 – title
Chapter III – Section 4 – title
4 Additional obligations for very large online platforms to manage systemic risks
Amendment 411 #
Proposal for a regulation
Article 26 – title
Article 26 – title
Amendment 412 #
Proposal for a regulation
Article 2 b (new)
Article 2 b (new)
Amendment 413 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, the impact of the functioning and use made of their services in the Union on fundamental rights. This impact assessment shall be specific to their services and shall include the following adverse impacts:
Amendment 414 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
Article 26 – paragraph 1 – point a
(a) the dissemination of manifestly illegal content through their services;
Amendment 417 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information including freedom and pluralism of media, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
Amendment 418 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
Article 26 – paragraph 1 – point c
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of fundamental rights.
Amendment 419 #
Proposal for a regulation
Article 26 – paragraph 2
Article 26 – paragraph 2
2. When conducting impact assessments, very large online platforms shall take into account, in particular, the effects of their content moderation systems, recommender systems and systems for selecting and displaying advertisement on any of the adverse impacts referred to in paragraph 1, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions.
Amendment 420 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
Article 26 – paragraph 2 a (new)
2 a. The outcome of the impact assessment and supporting documents shall be communicated to the Board of Digital Service Coordinators and the Digital Services Coordinator of establishment. A summary version of the impact assessment shall be made publicly available in an easily accessible format.
Amendment 421 #
Proposal for a regulation
Article 27 – title
Article 27 – title
Mitigation of adverse impacts
Amendment 422 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific adverse impacts identified pursuant to Article 26, where mitigation is possible without adversely impacting other fundamental rights. Such measures may include, where applicable:
Amendment 425 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Article 27 – paragraph 1 – point e
Amendment 426 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
Article 27 – paragraph 1 a (new)
1 a. The decision as to the choice of measures shall remain with the platform.
Amendment 427 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent adverse impacts reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 428 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the adverse impacts identified.
Amendment 429 #
Proposal for a regulation
Article 27 – paragraph 3
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue recommendations on the application of paragraph 1 in relation to specific adverse impacts, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those recommendations the Commission shall organise public consultations.
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 431 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
Amendment 432 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Article 28 – paragraph 1 – point b
Amendment 433 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by the European Union Agency for Fundamental Rights. The Agency may decide to perform the audit in collaboration with organisations which:
Amendment 434 #
Proposal for a regulation
Article 28 – paragraph 4
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Amendment 440 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
Article 29 – paragraph 2 a (new)
2 a. In addition to the obligations applicable to all online platforms, very large online platforms shall offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties shall be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
Amendment 442 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
Article 29 – paragraph 2 b (new)
2 b. Very large online platforms may only limit access to third-party recommender systems temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides by majority vote they are unnecessary or disproportionate.
Amendment 443 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
Article 29 – paragraph 2 c (new)
2 c. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
Amendment 450 #
Proposal for a regulation
Article 31 – paragraph 2
Article 31 – paragraph 2
2. With regard to moderation and recommendation systems, very large online platforms shall make publicly available and communicate to the Digital Services Coordinator of establishment and/or the Commission, upon request, access to algorithms by providing the relevant source code and associated data that allow the detection of possible biases or threats to fundamental rights, including freedom of expression. When disclosing these data, very large online platforms shall have a duty of explainability and ensure close cooperation with the Digital Services Coordinator or the Commission to make moderation and recommender systems fully understandable. When a bias is detected, very large online platforms should correct it expeditiously following requirements from the Digital Services Coordinator of establishment or the Commission. Very large online platforms shall be able to demonstrate their compliance at every step of the process pursuant to this Article.
Amendment 451 #
Proposal for a regulation
Article 31 – paragraph 4
Article 31 – paragraph 4
Amendment 453 #
Proposal for a regulation
Article 31 – paragraph 5
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, in line with Directive (EU) 2016/943 and maintaining the security of their service.
Amendment 454 #
Proposal for a regulation
Article 31 – paragraph 6
Article 31 – paragraph 6
Amendment 455 #
Proposal for a regulation
Article 31 – paragraph 6 – point a
Article 31 – paragraph 6 – point a
Amendment 456 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
Article 31 – paragraph 6 – point b
Amendment 457 #
Proposal for a regulation
Article 31 – paragraph 7
Article 31 – paragraph 7
Amendment 458 #
Proposal for a regulation
Article 31 – paragraph 7 – subparagraph 1
Article 31 – paragraph 7 – subparagraph 1
Amendment 461 #
Proposal for a regulation
Article 33 a (new)
Article 33 a (new)
Amendment 462 #
Proposal for a regulation
Article 35 – paragraph 1
Article 35 – paragraph 1
1. The Commission and the Board may facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and adverse impacts, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 464 #
Proposal for a regulation
Article 35 – paragraph 2
Article 35 – paragraph 2
2. Where significant adverse impacts within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 465 #
Proposal for a regulation
Article 35 – paragraph 3
Article 35 – paragraph 3
Amendment 466 #
Proposal for a regulation
Article 35 – paragraph 4
Article 35 – paragraph 4
4. The Commission and the Board may assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and may regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
Amendment 467 #
Proposal for a regulation
Article 35 – paragraph 5
Article 35 – paragraph 5
5. The Board may regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Amendment 468 #
Proposal for a regulation
Article 36 – paragraph 1
Article 36 – paragraph 1
1. The Commission may facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, organisations representing recipients of the service, civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
Amendment 469 #
Proposal for a regulation
Article 36 – paragraph 3
Article 36 – paragraph 3
Amendment 470 #
Proposal for a regulation
Article 37 – paragraph 1
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
Amendment 471 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
Article 37 – paragraph 2 – introductory part
2. The Commission may encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
Amendment 472 #
Proposal for a regulation
Article 37 – paragraph 2 – point a
Article 37 – paragraph 2 – point a
(a) displaying prominent information on the crisis situation, provided by Member States’ authorities or at Union level, which is also accessible to persons with disabilities;
Amendment 473 #
Proposal for a regulation
Article 37 – paragraph 3
Article 37 – paragraph 3
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
Amendment 474 #
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
Article 37 – paragraph 4 – point f a (new)
(f a) measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing an accessible description of these protocols;
Amendment 475 #
Proposal for a regulation
Article 37 a (new)
Article 37 a (new)
Article 37 a Accountability and transparency 1. Before initiating or facilitating the negotiation or the revision of codes of conduct, the Commission shall (a) consider the appropriateness of proposing legislation; (b) publish the elements of the code which it could propose or advocate; (c) invite the European Parliament, the Council, the Fundamental Rights Agency, the public and, where relevant, the European Data Protection Supervisor to express their opinion and publish their opinions; (d) conduct a Fundamental Rights Impact Assessment and publish the findings. 2. The Commission shall subsequently publish the elements of the envisaged code, which it intends to propose or advocate in the negotiations. It shall not propose or advocate elements, which the European Parliament or the Council object to or which have not been subject to the process set out in paragraph 1. 3. The Commission shall allow representatives of non-governmental organisations, which advocate the interests of the recipients of relevant services, the European Parliament, the Council and the Fundamental Rights Agency to observe the negotiations and to have access to all documents pertaining to them. The Commission shall offer compensation to non-profit participants. 4. The Commission shall publish codes of conduct and their parties and keep the information updated. 5. This Article shall apply, mutatis mutandis, to crisis protocols.
Amendment 498 #
Proposal for a regulation
Recital 62 a (new)
Recital 62 a (new)
(62a) Recommender systems used by very large online platforms pose a particular risk in terms of consumer choice and lock-in effects. Consequently, in addition to the obligations applicable to all online platforms, very large online platforms should offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems, including through application programming interfaces.
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13 a (new)
Amendment 783 #
Proposal for a regulation
Article 6 – paragraph 1
Article 6 – paragraph 1
Amendment 797 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Article 7 – paragraph 1 a (new)
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity.
Amendment 895 #
Proposal for a regulation
Article 9 a (new)
Article 9 a (new)
Article 9a Exclusion for micro enterprises and not- for-profit services This Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100,000 monthly active users.
Amendment 936 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user- friendly and unambiguous language, and in an easily accessible and machine- readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available.
Amendment 955 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
Article 12 – paragraph 2 a (new)
2a. Any restriction referred to in paragraph 1 must respect the fundamental rights enshrined in the Charter and relevant national law.
Amendment 957 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
Article 12 – paragraph 2 b (new)
2b. Individuals who are enforcing restrictions on the basis of terms and conditions of providers of intermediary services shall be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant.
Amendment 960 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall notify the recipients of the service of any change to the contract terms and conditions that can affect their rights and provide a user-friendly explanation thereof. The changes shall not be implemented before the expiry of a notice period which is reasonable and proportionate to the nature and extent of the envisaged changes and to their consequences for the recipients of the service. That notice period shall be at least 15 days from the date on which the provider of intermediary services notifies the recipients about the changes. Failure to consent to such changes should not lead to basic services becoming unavailable.
Amendment 1034 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator may, in some cases, identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements:
Amendment 1041 #
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
Article 14 – paragraph 2 – point a a (new)
(aa) evidence that substantiates the claim, where possible;
Amendment 1045 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
Article 14 – paragraph 2 – point b
(b) a clear indication of the exact electronic location of that information, such as the URL or URLs or other identifiers where appropriate, and, where necessary, additional information enabling the identification of the alleged illegal content;
Amendment 1051 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
Article 14 – paragraph 2 – point c
Amendment 1054 #
Proposal for a regulation
Article 14 – paragraph 3
Article 14 – paragraph 3
Amendment 1065 #
Proposal for a regulation
Article 14 – paragraph 4
Article 14 – paragraph 4
4. The individual or entity that submitted the notice shall be given the option to provide an electronic mail address to enable the provider of hosting services to promptly send a confirmation of receipt of the notice to that individual or entity.
Amendment 1085 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
Article 14 – paragraph 6 a (new)
6a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
Amendment 1086 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
Article 14 – paragraph 6 b (new)
6b. Information that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality is still pending. Member States shall ensure that providers of intermediary services are not held liable for failure to remove notified information, while the assessment of legality is still pending.
Amendment 1090 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
Article 14 – paragraph 6 c (new)
6c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.
Amendment 1091 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
Article 14 – paragraph 6 d (new)
6d. The provider of hosting services shall ensure that processing of notices is undertaken by qualified individuals to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where relevant professional support, qualified psychological assistance and legal advice.
Amendment 1128 #
Proposal for a regulation
Article 15 a (new)
Article 15 a (new)
Amendment 1153 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
Article 17 – paragraph 1 – point a
(a) decisions to remove, demote, disable access to or impose other sanctions against the information;
Amendment 1175 #
Proposal for a regulation
Article 17 – paragraph 2
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly, non-discriminatory, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities.
Amendment 1180 #
Proposal for a regulation
Article 17 – paragraph 3
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, objective, non-discriminatory and non-arbitrary manner and within seven days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
Amendment 1190 #
Proposal for a regulation
Article 17 – paragraph 5
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not solely taken on the basis of automated means and are reviewed by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
Amendment 1215 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms, any third party involved in the dispute and recipients of the service provided by the online platforms;
Amendment 1328 #
Proposal for a regulation
Article 20 – paragraph 2
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaint-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints or initiate dispute settlements that are manifestly unfounded.
Amendment 1337 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
Article 20 – paragraph 3 – point c
(c) the gravity of the misuses and its consequences, in particular on the exercise of fundamental rights, regardless of the absolute numbers or relative proportion;
Amendment 1342 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
Article 20 – paragraph 3 – point d a (new)
(da) the fact that notices and complaints were submitted following the use of an automated content recognition system;
Amendment 1343 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
Article 20 – paragraph 3 – point d b (new)
(db) any justification provided by the recipient of the service to provide sufficient grounds to consider that the information is not manifestly illegal.
Amendment 1347 #
Proposal for a regulation
Article 20 – paragraph 4
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, with due regard to their obligations under Article 12(2), in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
Amendment 1518 #
Proposal for a regulation
Article 24 a (new)
Article 24 a (new)
Amendment 1544 #
Proposal for a regulation
Article 26 – title
Article 26 – title
Amendment 1553 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, the impact of the functioning and use made of their services in the Union on fundamental rights, including Article 38 of the Charter of Fundamental Rights of the European Union on ensuring a high level of consumer protection. This impact assessment shall be specific to their services and shall include the following adverse impacts:
Amendment 1558 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of manifestly illegal content through their services;
Amendment 1571 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, including Article 38 of the Charter of Fundamental Rights of the European Union, and in particular the rights to respect for private and family life, freedom of expression and information, freedom of the press, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
Amendment 1575 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of fundamental rights as foreseen by the Charter of Fundamental Rights of the European Union, including Article 38 on ensuring a high level of consumer protection.
Amendment 1591 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting impact assessments, very large online platforms shall take into account, in particular, the effects of their content moderation systems, recommender systems and systems for selecting and displaying advertisement on any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions.
Amendment 1594 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. The outcome of the impact assessment and supporting documents shall be communicated to the Board of Digital Services Coordinators and to the Digital Services Coordinator of establishment. A summary version of the impact assessment shall be made publicly available in an easily accessible format.
Amendment 1598 #
Proposal for a regulation
Article 27 – title
Mitigation of adverse impacts
Amendment 1600 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific adverse impacts identified pursuant to Article 26, where mitigation is possible without adversely impacting other fundamental rights. Such measures may include, where applicable:
Amendment 1621 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Amendment 1629 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. The decision as to the choice of measures shall remain with the platform.
Amendment 1635 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent adverse impacts reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
Amendment 1640 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the adverse impacts identified.
Amendment 1645 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific impacts, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, as enshrined in the Charter, of all parties involved. When preparing those recommendations the Commission shall organise public consultations.
Amendment 1652 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the obligations set out in Chapter III.
Amendment 1655 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
Amendment 1660 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
Amendment 1663 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits pursuant to paragraph 1 shall be performed by the European Union Agency for Fundamental Rights. The Agency may decide to perform the audit in collaboration with organisations which:
Amendment 1681 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them, with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Amendment 1703 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. In addition to the obligations applicable to all online platforms, very large online platforms shall offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
Amendment 1705 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2b. Very large online platforms may only limit access to third-party recommender systems temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides by majority vote that they are unnecessary or disproportionate.
Amendment 1706 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
2c. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
Amendment 1806 #
Proposal for a regulation
Article 33 a (new)
Amendment 1850 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board may facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and adverse impacts, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 1857 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant adverse impacts within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1861 #
Proposal for a regulation
Article 35 – paragraph 3
Amendment 1871 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board may assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and may regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
Amendment 1876 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board may regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Amendment 1884 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission may encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
Amendment 1898 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
Amendment 1899 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
2. The Commission may encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
Amendment 1901 #
Proposal for a regulation
Article 37 – paragraph 3
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
Amendment 1905 #
Proposal for a regulation
Article 37 a (new)
Article 37a
Accountability and transparency
1. Before initiating or facilitating the negotiation or the revision of codes of conduct, the Commission shall:
(a) consider the appropriateness of proposing legislation;
(b) publish the elements of the code which it could propose or advocate;
(c) invite the European Parliament, the Council, the Fundamental Rights Agency, the public and, where relevant, the European Data Protection Supervisor to express their opinion, and publish their opinions;
(d) conduct a Fundamental Rights Impact Assessment and publish the findings.
2. The Commission shall subsequently publish the elements of the envisaged code which it intends to propose or advocate in the negotiations. It shall not propose or advocate elements which the European Parliament or the Council object to or which have not been subject to the process set out in paragraph 1.
3. The Commission shall allow representatives of non-governmental organisations which advocate the interests of the recipients of relevant services, the European Parliament, the Council and the Fundamental Rights Agency to observe the negotiations and to have access to all documents pertaining to them. The Commission shall offer compensation to non-profit participants.
4. The Commission shall publish codes of conduct and their parties and keep the information updated.
5. This Article shall apply, mutatis mutandis, to crisis protocols.