
Activities of Marcel KOLAJA related to 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/10/05
Committee: CULT
Dossiers: 2020/0361(COD)
Documents: PDF(421 KB) DOC(265 KB)
Authors: Sabine VERHEYEN (MEP 96756)

Amendments (208)

Amendment 127 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. At the same time, it respects Member States' competence to adopt and further develop laws in order to protect and promote the freedom of expression in line with the Charter of Fundamental Rights of the European Union. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/23
Committee: CULT
Amendment 135 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of national laws implemented in line with Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected.
2021/07/23
Committee: CULT
Amendment 141 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question. For the purpose of this Regulation, the concept of “manifestly illegal content” should be defined as any information which has been the subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State.
2021/07/23
Committee: CULT
Amendment 149 #
Proposal for a regulation
Recital 15 a (new)
(15 a) Ensuring that providers of intermediary services can offer effective end-to-end encryption of data is essential for trust in and security of digital services in the Digital Single Market, and effectively prevents unauthorised third-party access.
2021/07/23
Committee: CULT
Amendment 160 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
deleted
2021/07/23
Committee: CULT
Amendment 165 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national judicial authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Providers of intermediary services should not be obliged to use automated tools for content moderation. Nothing in this Regulation should be construed as preventing providers of intermediary services from offering end-to-end encrypted services, or as making the provision of such services a cause for liability or loss of immunity.
2021/07/23
Committee: CULT
Amendment 169 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
2021/07/23
Committee: CULT
Amendment 171 #
Proposal for a regulation
Recital 32
(32) The orders by judicial authorities to provide information about one or more specific suspects of a serious threat to public security, or orders by national authorities about a specific item of information about service providers' licence numbers, address of a rental, or number of nights let, regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are a suspect or suspects of a serious threat to public security, identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
2021/07/23
Committee: CULT
Amendment 172 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders to provide information relate to specific items of information where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
2021/07/23
Committee: CULT
Amendment 173 #
Proposal for a regulation
Recital 33 a (new)
(33 a) With the exception of Article 7, rules in Chapter II should not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service, including free and open source software projects, online encyclopaedias and educational or scientific repositories. Content managed by educational and scientific repositories is subject to various national laws, is made available in order to safeguard the public interest and is meant to be re-used by students, researchers and the general public.
2021/07/23
Committee: CULT
Amendment 178 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of the fundamental rights of recipients of the service and the avoidance of discriminatory, unfair or arbitrary outcomes. Terms and conditions of providers of intermediary services should respect the essential principles of human rights as enshrined in the Charter and international law, including the right to freedom of expression. The freedom and pluralism of media should be respected; to this end, Member States should ensure that editorial content providers’ and media service providers’ possibilities to contest decisions of online platforms or to seek judicial redress in accordance with the laws of the Member State concerned are unaffected.
2021/07/23
Committee: CULT
Amendment 184 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report in a standardised and machine-readable format, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC,40 or as a not-for-profit service with fewer than 100.000 monthly active users. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/23
Committee: CULT
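The amendment leaves the "standardised and machine-readable format" unspecified. As a purely illustrative sketch, one annual report entry could be serialised like this; every field name here is an assumption, not taken from the text:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ModerationReportEntry:
    """One annual transparency-report record (illustrative fields only)."""
    period: str                  # reporting year, e.g. "2021"
    orders_received: int         # orders issued by Member State authorities
    notices_received: int        # notices under the notice and action mechanism
    own_initiative_actions: int  # moderation on the provider's own initiative
    terms_based_actions: int     # measures enforcing the terms and conditions
    median_response_days: float  # time taken to act on notices

entry = ModerationReportEntry("2021", 120, 4530, 210, 1875, 2.5)
print(json.dumps(asdict(entry), indent=2))  # standardised, machine-readable output
```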
Amendment 186 #
Proposal for a regulation
Recital 39 a (new)
(39 a) Recipients of the service should be empowered to make autonomous decisions inter alia regarding the acceptance of and changes to terms and conditions, advertising practices, privacy and other settings, and recommender systems when interacting with intermediary services. Dark patterns, however, typically exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want or to reveal personal information they would prefer not to disclose. Therefore, providers of intermediary services should be prohibited from deceiving or nudging recipients of the service and from subverting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof (‘dark pattern’). This includes, but is not limited to, exploitative design choices to direct the user to actions that benefit the provider of intermediary services, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, repetitively requesting or pressuring the recipient to make a decision, or hiding or obscuring certain options.
2021/07/23
Committee: CULT
Amendment 188 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Content that has been notified and that is not manifestly illegal should remain accessible while the assessment of its legality by the judicial authority is still pending. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. Recipients of the service who provided the information to which the notice relates should be given the opportunity to reply before a decision is taken. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/23
Committee: CULT
Amendment 189 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property and the right to non-discrimination of parties affected by illegal content.
2021/07/23
Committee: CULT
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of the information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to creating financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and participation at multiple levels of the advertising value chain has created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients. Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
2021/07/08
Committee: IMCO
Amendment 193 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/23
Committee: CULT
Amendment 194 #
Proposal for a regulation
Recital 42 a (new)
(42 a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in overblocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood as making publishing subject to an automated decision. Filtering automated content submissions such as spam should be permitted. Where automated tools are otherwise used for content moderation the provider should ensure human review and the protection of legal content.
2021/07/23
Committee: CULT
Amendment 197 #
Proposal for a regulation
Recital 44
(44) Recipients of the service, including persons with disabilities, should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/23
Committee: CULT
Amendment 202 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/23
Committee: CULT
Amendment 206 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by repeatedly submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress against the decisions taken in this regard by online platforms should always be available, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/23
Committee: CULT
Amendment 211 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/23
Committee: CULT
Amendment 218 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the impact of the functioning and use of their service, as well as of potential misuses by the recipients of the service, on fundamental rights, and take appropriate mitigating measures.
2021/07/23
Committee: CULT
Amendment 219 #
Proposal for a regulation
Recital 57
(57) Three categories of adverse impact should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of manifestly illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, including freedom and pluralism of media, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the malfunctioning or intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security, the protection of minors and fundamental rights. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated exploitation of the service.
2021/07/23
Committee: CULT
Amendment 226 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the adverse impacts identified in the fundamental rights impact assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of adverse impacts. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of the adverse impact identified on the fundamental rights of the recipients of the service.
2021/07/23
Committee: CULT
Amendment 231 #
Proposal for a regulation
Recital 59
(59) Very large online platforms should, where appropriate, conduct their impact assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations.
2021/07/23
Committee: CULT
Amendment 232 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the Fundamental Rights Agency and the auditor access to all relevant data necessary to perform the audit properly. The Fundamental Rights Agency and the auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. The Fundamental Rights Agency and the auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets in line with Directive (EU) 2016/943, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
2021/07/23
Committee: CULT
Amendment 234 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. In addition, very large online platforms should offer the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties should be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
2021/07/23
Committee: CULT
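Recital 62, as amended, effectively describes a menu of recommender systems: disclosed main parameters, at least one option not based on profiling, and optional third-party alternatives with equal access to the platform's features. A minimal sketch of that choice, with hypothetical option names and parameters:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RecommenderOption:
    """One user-selectable recommender system (illustrative shape)."""
    name: str
    main_parameters: List[str]  # disclosed in an easily comprehensible manner
    uses_profiling: bool
    provided_by: str            # the platform itself or a third-party provider

OPTIONS = [
    RecommenderOption("personalised", ["watch history", "engagement"], True, "platform"),
    RecommenderOption("chronological", ["recency"], False, "platform"),
    RecommenderOption("community-ranked", ["peer ratings"], False, "third-party.example"),
]

def offered_options(consented_to_profiling: bool) -> List[RecommenderOption]:
    # At least one option not based on profiling must always be available.
    return [o for o in OPTIONS if consented_to_profiling or not o.uses_profiling]
```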
Amendment 239 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems by providing relevant source code and associated data that allow the detection of possible biases or threats to fundamental rights for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers of possible biases or the evolution and severity of online threats to fundamental rights are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information in line with Directive (EU) 2016/943, and the privacy of any other parties concerned, including the recipients of the service.
2021/07/23
Committee: CULT
Amendment 240 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Given the cross-border nature of the services at stake, EU action to harmonise accessibility requirements for very large online platforms across the internal market is necessary to avoid market fragmentation and to ensure that an equal right to access and choice of those services for persons with disabilities is guaranteed. Lack of harmonised accessibility requirements for digital services can create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for very large online platforms, including their user interfaces, must be consistent with existing Union accessibility legislation, including Directive (EU) 2019/882 and Directive (EU) 2016/2102.
2021/07/23
Committee: CULT
Amendment 242 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board may facilitate the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/07/23
Committee: CULT
Amendment 243 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
deleted
2021/07/23
Committee: CULT
Amendment 245 #
Proposal for a regulation
Recital 70
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct may support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.
2021/07/23
Committee: CULT
Amendment 246 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms may be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. The Commission should also ensure that measures are in place to ensure accessibility for persons with disabilities during the implementation of crisis protocols.
2021/07/23
Committee: CULT
Amendment 247 #
Proposal for a regulation
Recital 71 a (new)
(71 a) Before initiating or facilitating the negotiation or the revision of codes of conduct, the Commission should consider the appropriateness of proposing legislation and invite the European Parliament, the Council, the Fundamental Rights Agency, the public and, where relevant, the European Data Protection Supervisor to express their opinions, and should publish those opinions. It should also conduct a Fundamental Rights Impact Assessment and publish the findings.
2021/07/23
Committee: CULT
Amendment 261 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5 a. This Regulation is without prejudice to national law regulating the protection and/or promotion of cultural diversity and plurality of the media in conformity with Union law and the Charter of Fundamental Rights of the European Union.
2021/07/23
Committee: CULT
Amendment 265 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
(g a) ‘manifestly illegal content’ means any information which has been the subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
2021/07/23
Committee: CULT
Amendment 272 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(q a) ‘persons with disabilities’ means persons within the meaning of Article 3 (1) of Directive (EU) 2019/882;
2021/07/23
Committee: CULT
Amendment 274 #
Proposal for a regulation
Article 6
Article 6
Voluntary own-initiative investigations and legal compliance
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
deleted
2021/07/23
Committee: CULT
Amendment 277 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services shall not be obliged to use automated tools for content moderation.
2021/07/23
Committee: CULT
Amendment 278 #
Proposal for a regulation
Article 7 – paragraph 1 b (new)
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity.
2021/07/23
Committee: CULT
Amendment 282 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
2021/07/23
Committee: CULT
Amendment 284 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) the territorial scope of an order addressed to a provider that has its main establishment or, if the provider is not established in the Union, its legal representation in another Member State is limited to the territory of the Member State issuing the order;
2021/07/23
Committee: CULT
Amendment 285 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(b b) if addressed to a provider that has its main establishment outside the Union, the territorial scope of the order, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
2021/07/23
Committee: CULT
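Read together, the two new points derive an order's territorial scope from where the provider is established and which law is infringed. A hedged sketch of that decision logic; the domestic fallback at the end is an assumption, since the points do not address it:

```python
from typing import Optional

def order_scope(provider_in_union: bool,
                provider_member_state: Optional[str],
                issuing_member_state: str,
                union_law_infringed: bool) -> str:
    """Sketch of points (ba) and (bb) on the territorial scope of an order."""
    if provider_in_union and provider_member_state != issuing_member_state:
        # (ba): provider established or legally represented in another Member State
        return f"territory of {issuing_member_state}"
    if not provider_in_union:
        # (bb): provider with its main establishment outside the Union
        if union_law_infringed:
            return "territory of the Union"
        return f"territory of {issuing_member_state}"
    # Domestic orders are not addressed by these points (assumption).
    return f"territory of {issuing_member_state}"
```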
Amendment 287 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt via a secure communication channel of an order to provide a specific item of information about one or more specific suspects of a serious threat to public security, issued by the relevant national judicial or administrative authority on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/07/23
Committee: CULT
Amendment 289 #
Proposal for a regulation
Article 9 – paragraph 1 a (new)
1 a. Providers of intermediary services shall, upon receipt via a secure communication channel of an order to provide national authorities, where proportionate and strictly necessary for the enforcement of existing national, regional or local regulation, with a specific item of information about service providers’ licence numbers, the address of a rental, the number of nights let on the platform or the number of services provided, provide that information in compliance with Regulation (EU) 2016/679.
2021/07/23
Committee: CULT
Amendment 290 #
Proposal for a regulation
Article 9 – paragraph 2 – introductory part
2. Member States shall ensure that orders referred to in paragraph 1 seek information about suspect or suspects of serious crime and meet the following conditions:
2021/07/23
Committee: CULT
Amendment 292 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a detailed statement of reasons explaining the legal basis and the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, taking due account of the impact of the measures on fundamental rights;
2021/07/23
Committee: CULT
Amendment 293 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
— a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/07/23
Committee: CULT
Amendment 295 #
Proposal for a regulation
Article 9 a (new)
Article 9 a With the exception of Article 7, this Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100.000 monthly active users.
2021/07/23
Committee: CULT
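The exemption in the new Article 9a can be read as a simple eligibility test. A sketch under that reading; the micro-enterprise thresholds come from the Annex to Recommendation 2003/361/EC (fewer than 10 persons and an annual turnover or balance sheet total of at most EUR 2 million):

```python
def exempt_from_chapter_ii(staff_headcount: int,
                           annual_turnover_eur: float,
                           not_for_profit: bool,
                           monthly_active_users: int) -> bool:
    """Sketch of the Article 9a carve-out; Article 7 still applies either way."""
    # Micro enterprise per the Annex to Recommendation 2003/361/EC.
    is_micro = staff_headcount < 10 and annual_turnover_eur <= 2_000_000
    # Not-for-profit services below the amendment's user threshold.
    small_nonprofit = not_for_profit and monthly_active_users < 100_000
    return is_micro or small_nonprofit
```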
Amendment 298 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, user-friendly and unambiguous language and shall be publicly available in an easily accessible and machine-readable format in the language in which the service is offered.
2021/07/23
Committee: CULT
Amendment 300 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available.
2021/07/23
Committee: CULT
Amendment 301 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a coherent, predictable, non-discriminatory, transparent, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, in compliance with procedural safeguards and with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter and relevant national law.
2021/07/23
Committee: CULT
Amendment 306 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Any restriction referred to in paragraph 1 must respect fundamental rights enshrined in the Charter and relevant national law.
2021/07/23
Committee: CULT
Amendment 308 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2 b. Individuals who are enforcing restrictions on the basis of terms and conditions of providers of intermediary services should be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant.
2021/07/23
Committee: CULT
Amendment 309 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2 c. Terms and conditions of providers of intermediary services shall respect the essential principles of human rights as enshrined in the Charter and international law.
2021/07/23
Committee: CULT
Amendment 310 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2 d. Member States shall ensure that editorial content providers’ and media service providers’ possibilities to contest decisions of online platforms or to seek judicial redress in accordance with the laws of the Member State concerned are unaffected.
2021/07/23
Committee: CULT
Amendment 311 #
Proposal for a regulation
Article 12 – paragraph 2 e (new)
2 e. Terms that do not comply with this Article shall not be binding on recipients.
2021/07/23
Committee: CULT
Amendment 320 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language setting communicated by the device of the recipient or the digital location where the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or a clearly identifiable group of recipients/persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
2021/07/23
Committee: CULT
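Read programmatically, the new Article 13a amounts to an allow-list test on targeting signals. A minimal sketch under that reading; the signal and identifier names are illustrative assumptions, not drawn from the amendment:

```python
# Contextual signals paragraph 2 permits (illustrative names).
ALLOWED_CONTEXTUAL_SIGNALS = {"keywords", "language_setting", "digital_location"}

# Identifiers paragraph 3 lists as enabling direct or indirect identification.
PERSONAL_IDENTIFIERS = {"name", "identification_number", "location_data",
                        "online_identifier", "physical", "physiological",
                        "genetic", "mental", "economic", "cultural", "social"}

def targeting_permissible(signals) -> bool:
    """Article 13a test (sketch): only contextual signals may be used, and
    they must not identify a person or a clearly identifiable group."""
    signals = set(signals)
    return signals <= ALLOWED_CONTEXTUAL_SIGNALS and not (signals & PERSONAL_IDENTIFIERS)

print(targeting_permissible(["keywords", "language_setting"]))  # True
print(targeting_permissible(["keywords", "location_data"]))     # False
```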
Amendment 328 #
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
(a a) evidence that substantiates the claim, where possible;
2021/07/23
Committee: CULT
Amendment 329 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the exact electronic location of that information, such as the exact URL or URLs or other identifiers where appropriate, and, where necessary, additional information enabling the identification of the alleged illegal content;
2021/07/23
Committee: CULT
Amendment 333 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
(c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
deleted
2021/07/23
Committee: CULT
Amendment 334 #
Proposal for a regulation
Article 14 – paragraph 2 – point c a (new)
(c a) where the information concerns an alleged infringement of an intellectual property right, evidence that the entity submitting the notice is the rightholder or authorised to act on behalf of the rightholder;
2021/07/23
Committee: CULT
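Taken together, the amendments to Article 14(2) and 14(4) reshape what a notice must and may contain: substantiating evidence where possible (point (aa)), a location indication (point (b)), an optional rather than mandatory contact address (point (c) deleted), and proof of entitlement for intellectual property notices (point (ca)). An illustrative data shape, not a prescribed schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    """Illustrative shape of a notice under the amended Article 14(2)."""
    explanation: str                         # why the information is allegedly illegal
    location: str                            # exact URL(s) or other identifier, point (b)
    evidence: Optional[str] = None           # substantiating evidence, where possible, point (aa)
    contact_email: Optional[str] = None      # optional, since point (c) is deleted
    rightholder_proof: Optional[str] = None  # required for IP notices, point (ca)

def valid_ip_notice(n: Notice) -> bool:
    # A notice alleging an intellectual property infringement must show that
    # the sender is the rightholder or authorised to act on its behalf.
    return bool(n.explanation and n.location and n.rightholder_proof)
```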
Amendment 335 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
deleted
2021/07/23
Committee: CULT
Amendment 337 #
Proposal for a regulation
Article 14 – paragraph 4
4. The individual or entity that submitted the notice shall be given the option to provide an electronic mail address to enable the provider of hosting services to promptly send a confirmation of receipt of the notice to that individual or entity. Where individuals decide to include their contact details in a notice, their anonymity towards the recipient of the service who provided the content shall be ensured, except in cases of alleged violations of personality rights or of intellectual property rights.
2021/07/23
Committee: CULT
Amendment 339 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
2021/07/23
Committee: CULT
Amendment 340 #
Proposal for a regulation
Article 14 – paragraph 5 a (new)
5 a. The provider of intermediary services shall also notify the recipient of the service who provided the information, where contact details are available, giving them the opportunity to reply before taking a decision, unless this would obstruct the prevention and prosecution of serious criminal offences.
2021/07/23
Committee: CULT
Amendment 344 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
2021/07/23
Committee: CULT
Amendment 345 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6 b. Information that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality by the competent authority is still pending. Member States shall ensure that providers of intermediary services are not held liable for failure to remove notified information, while the assessment of legality is still pending.
2021/07/23
Committee: CULT
Amendment 346 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6 c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.
2021/07/23
Committee: CULT
Amendment 347 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
6 d. The provider of hosting services shall ensure that processing of notices is undertaken by qualified individuals to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
2021/07/23
Committee: CULT
Amendment 351 #
Proposal for a regulation
Article 15 a (new)
Article 15 a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are provided to staff, and that, where necessary, they are given the opportunity to seek professional support, qualified psychological assistance and qualified legal advice. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
2021/07/23
Committee: CULT
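
The logic of Amendment 351 above can be summarised as publish-first moderation: automated tools may filter machine-generated submissions such as spam and may flag published content, but any action on the flagged item rests with qualified staff. A toy sketch of that control flow, with every function name hypothetical:

def publish(item):
    # Content goes live without any ex-ante control of the upload.
    print(f"published: {item}")

def moderate(item, is_spam, automated_flag, human_decision):
    # Hypothetical flow matching the amendment: publish first, let
    # automated tools only flag, and leave the decision to trained staff.
    if is_spam(item):
        return "rejected"             # filtering automated submissions stays allowed
    publish(item)
    if automated_flag(item):
        return human_decision(item)   # qualified staff decide on any action
    return "kept"

print(moderate("holiday photo", lambda i: False, lambda i: False, lambda i: "kept"))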
Amendment 353 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, demote, or disable access to or impose other sanctions against the information;
2021/07/23
Committee: CULT
Amendment 357 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly, non-discriminatory and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities.
2021/07/23
Committee: CULT
Amendment 358 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, non-discriminatory and non-arbitrary manner and within seven days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/07/23
Committee: CULT
Amendment 359 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means and are reviewed by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
2021/07/23
Committee: CULT
Amendment 362 #
Proposal for a regulation
Article 18 – paragraph 2 – introductory part
2. The Digital Services Coordinator of the Member State where the independent out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum of three years, which can be renewed, where the body has demonstrated that it meets all of the following conditions:
2021/07/23
Committee: CULT
Amendment 363 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms, recipients of the service provided by the online platforms or any third party involved in the dispute, and its members are remunerated in a way that is not linked to the outcome of the procedure;
2021/07/23
Committee: CULT
Amendment 364 #
Proposal for a regulation
Article 18 – paragraph 2 – point a a (new)
(a a) it is composed of legal experts;
2021/07/23
Committee: CULT
Amendment 365 #
Proposal for a regulation
Article 18 – paragraph 2 – point b
(b) it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute, as well as a general understanding of law;
2021/07/23
Committee: CULT
Amendment 366 #
Proposal for a regulation
Article 18 – paragraph 2 – point b a (new)
(b a) the natural persons with responsibility for dispute settlement are granted a period of office of a minimum of three years to ensure the independence of their actions;
2021/07/23
Committee: CULT
Amendment 367 #
Proposal for a regulation
Article 18 – paragraph 2 – point b b (new)
(b b) the natural persons with responsibility for dispute settlement commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended;
2021/07/23
Committee: CULT
Amendment 368 #
Proposal for a regulation
Article 18 – paragraph 2 – point b c (new)
(b c) natural persons with responsibility for dispute resolution may not have worked for an online platform or a professional organisation or business association of which the online platform is a member for a period of two years before taking up their position in the body;
2021/07/23
Committee: CULT
Amendment 369 #
Proposal for a regulation
Article 18 – paragraph 2 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/23
Committee: CULT
Amendment 371 #
Proposal for a regulation
Article 18 – paragraph 2 – point c a (new)
(c a) the anonymity of the individuals involved in the settlement procedure can be guaranteed;
2021/07/23
Committee: CULT
Amendment 372 #
Proposal for a regulation
Article 18 – paragraph 2 – point d
(d) it ensures the settling of a dispute in a swift, efficient and cost-effective manner and in at least one official language of the Union or, at the request of the recipient, at least in English;
2021/07/23
Committee: CULT
Amendment 373 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are easily and publicly accessible;
2021/07/23
Committee: CULT
Amendment 374 #
Proposal for a regulation
Article 18 – paragraph 2 – point e a (new)
(e a) it ensures that a preliminary decision is taken within a period of seven days following the receipt of the complaint and that the outcome of the dispute settlement is made available within a period of 90 calendar days from the date on which the body has received the complete complaint file.
2021/07/23
Committee: CULT
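
Point (e a) in Amendment 374 above fixes two clocks: seven days from receipt of the complaint for a preliminary decision, and 90 calendar days from receipt of the complete complaint file for the outcome. A small illustration of the arithmetic using only the standard library (the dates are invented):

from datetime import date, timedelta

def settlement_deadlines(complaint_received: date, complete_file_received: date):
    # The periods come from point (e a) itself; the field names are ours.
    return {
        "preliminary_decision_due": complaint_received + timedelta(days=7),
        "outcome_due": complete_file_received + timedelta(days=90),
    }

print(settlement_deadlines(date(2021, 7, 1), date(2021, 7, 10)))
# {'preliminary_decision_due': datetime.date(2021, 7, 8),
#  'outcome_due': datetime.date(2021, 10, 8)}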
Amendment 383 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/07/23
Committee: CULT
Amendment 385 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.deleted
2021/07/23
Committee: CULT
Amendment 388 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaints-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints or initiate dispute settlements that are manifestly unfounded.
2021/07/23
Committee: CULT
Amendment 391 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
2021/07/23
Committee: CULT
Amendment 393 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
(c) the gravity of the misuse and its consequences, in particular on the exercise of fundamental rights, regardless of the absolute numbers or relative proportion;
2021/07/23
Committee: CULT
Amendment 394 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(d a) the fact that notices and complaints were submitted following the use of an automated content recognition system;
2021/07/23
Committee: CULT
Amendment 395 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
(d b) any justification provided by the recipient of the service that provides sufficient grounds to consider that the information is not manifestly illegal.
2021/07/23
Committee: CULT
Amendment 396 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, with due regard to their obligations under Article 12(2), in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
2021/07/23
Committee: CULT
Amendment 397 #
Proposal for a regulation
Recital 42 a (new)
(42a) When moderating content, mechanisms voluntarily employed by platforms should not lead to ex-ante control measures based on automated tools or upload-filtering of content. Automated tools are currently unable to differentiate illegal content from content that is legal in a given context and therefore routinely result in over-blocking legal content. Human review of automated reports by service providers or their contractors does not fully solve this problem, especially if it is outsourced to private staff that lack sufficient independence, qualification and accountability. Ex-ante control should be understood as making publishing subject to an automated decision. Filtering automated content submissions such as spam should be permitted. Where automated tools are otherwise used for content moderation the provider should ensure human review and the protection of legal content.
2021/07/08
Committee: IMCO
Amendment 399 #
Proposal for a regulation
Article 22 – paragraph 1 – point a
(a) the name, address, provided that the trader is not a self-employed or independent professional whose address is his or her private address, telephone number and electronic mail address of the trader;
2021/07/23
Committee: CULT
Amendment 410 #
Proposal for a regulation
Chapter III – Section 4 – title
4 Additional obligations for very large online platforms to manage systemic risks
2021/07/23
Committee: CULT
Amendment 411 #
Proposal for a regulation
Article 26 – title
Fundamental rights impact assessment
2021/07/23
Committee: CULT
Amendment 412 #
Proposal for a regulation
Article 2 b (new)
Article 2 b
Targeting of digital advertising
1. Providers of information society services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of determining the recipients to whom advertisements are displayed.
2. This provision shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information such as keywords, the language setting communicated by the device of the recipient or the geographical region of the recipients to whom an advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of one or more natural persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or persons.
2021/07/19
Committee: JURI
Amendment 413 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, the impact of the functioning and use made of their services in the Union on fundamental rights. This impact assessment shall be specific to their services and shall include the following adverse impacts:
2021/07/23
Committee: CULT
Amendment 414 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of manifestly illegal content through their services;
2021/07/23
Committee: CULT
Amendment 417 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information including freedom and pluralism of media, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
2021/07/23
Committee: CULT
Amendment 418 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on fundamental rights.
2021/07/23
Committee: CULT
Amendment 419 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting impact assessments, very large online platforms shall take into account, in particular, the effects of their content moderation systems, recommender systems and systems for selecting and displaying advertisement on any of the adverse impacts referred to in paragraph 1, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions.
2021/07/23
Committee: CULT
Amendment 420 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2 a. The outcome of the impact assessment and supporting documents shall be communicated to the Board of Digital Service Coordinators and the Digital Services Coordinator of establishment. A summary version of the impact assessment shall be made publicly available in an easily accessible format.
2021/07/23
Committee: CULT
Amendment 421 #
Proposal for a regulation
Article 27 – title
Mitigation of adverse impacts
2021/07/23
Committee: CULT
Amendment 422 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific adverse impacts identified pursuant to Article 26, where mitigation is possible without adversely impacting other fundamental rights. Such measures may include, where applicable:
2021/07/23
Committee: CULT
Amendment 425 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Article 35 and 37 respectively.deleted
2021/07/23
Committee: CULT
Amendment 426 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1 a. The decision as to the choice of measures shall remain with the platform.
2021/07/23
Committee: CULT
Amendment 427 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent adverse impacts reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
2021/07/23
Committee: CULT
Amendment 428 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the adverse impacts identified.
2021/07/23
Committee: CULT
Amendment 429 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific adverse impacts, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those recommendations the Commission shall organise public consultations.
2021/07/23
Committee: CULT
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Article 13 a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language setting communicated by the device of the recipient or the digital location where the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or a clearly identifiable group of recipients/persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
2021/06/10
Committee: LIBE
Amendment 431 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following:
2021/07/23
Committee: CULT
Amendment 432 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.deleted
2021/07/23
Committee: CULT
Amendment 433 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by the European Union Agency for Fundamental Rights. The Agency may decide to perform the audit in collaboration with organisations which:
2021/07/23
Committee: CULT
Amendment 434 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non- compliance identified.
2021/07/23
Committee: CULT
Amendment 440 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2 a. In addition to the obligations applicable to all online platforms, very large online platforms shall offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties shall be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems.
2021/07/23
Committee: CULT
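
Amendment 440 above implies a plug-in architecture: the platform exposes the same features and inputs its own recommender consumes, and the recipient's chosen third-party provider determines the ordering. One possible shape for such an interface, sketched in Python with hypothetical names (the amendment itself prescribes no concrete API):

from typing import Dict, List, Protocol

class RecommenderProvider(Protocol):
    # Hypothetical plug-in contract; a real one would be defined by the
    # platform's published interface documentation.
    def rank(self, candidate_ids: List[str], context: Dict) -> List[str]: ...

class ChronologicalProvider:
    # A third-party provider that ignores engagement signals entirely and
    # orders items by the timestamps the platform itself uses.
    def rank(self, candidate_ids, context):
        return sorted(candidate_ids, key=lambda i: context["timestamps"][i], reverse=True)

def build_feed(candidate_ids, context, provider: RecommenderProvider):
    # The recipient's chosen provider, not the platform, decides the order.
    return provider.rank(candidate_ids, context)

print(build_feed(["a", "b"], {"timestamps": {"a": 1, "b": 2}}, ChronologicalProvider()))  # ['b', 'a']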
Amendment 442 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2 b. Very large online platforms may only limit access to third-party recommender systems temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 and Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides by majority vote that they are unnecessary or disproportionate.
2021/07/23
Committee: CULT
Amendment 443 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
2 c. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
2021/07/23
Committee: CULT
Amendment 450 #
Proposal for a regulation
Article 31 – paragraph 2
2. With regards to moderation and recommendation systems, very large online platforms shall make publicly available and communicate to the Digital Services Coordinator of establishment and/or the Commission, upon request, access to algorithms by providing the relevant source code and associated data that allow the detection of possible biases or threats to fundamental rights including freedom of expression. When disclosing these data, very large online platforms shall have a duty of explainability and ensure close cooperation with the Digital Services Coordinator or the Commission to make moderation and recommender systems fully understandable. When a bias is detected, very large online platforms should correct it expeditiously following requirements from the Digital Services Coordinator of establishment or the Commission. Very large online platforms shall be able to demonstrate their compliance at every step of the process pursuant to this Article.
2021/07/23
Committee: CULT
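
Amendment 450 above requires access to source code and associated data "that allow the detection of possible biases". One elementary form such a check could take is comparing moderation outcomes across groups of content, for example by language, in the disclosed data; the threshold and data layout below are invented purely for illustration:

from collections import defaultdict

def removal_rates(decisions):
    # decisions: iterable of (group, was_removed) pairs drawn from the
    # disclosed moderation data; 'group' could be a content language.
    removed, total = defaultdict(int), defaultdict(int)
    for group, was_removed in decisions:
        total[group] += 1
        removed[group] += int(was_removed)
    return {g: removed[g] / total[g] for g in total}

def flag_possible_bias(decisions, max_ratio=2.0):
    # A large gap between groups is a signal for further human review,
    # not in itself proof of an unlawful bias.
    rates = removal_rates(decisions)
    lowest, highest = min(rates.values()), max(rates.values())
    return highest > max_ratio * lowest, rates

data = [("cs", True), ("cs", True), ("cs", True), ("cs", False),
        ("en", True), ("en", False), ("en", False), ("en", False)]
print(flag_possible_bias(data))  # (True, {'cs': 0.75, 'en': 0.25})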
Amendment 451 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.deleted
2021/07/23
Committee: CULT
Amendment 453 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, in line with Directive (EU) 2016/943 and maintaining the security of their service.
2021/07/23
Committee: CULT
Amendment 454 #
Proposal for a regulation
Article 31 – paragraph 6
6. Within 15 days following receipt of a request as referred to in paragraph 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because one of following two reasons: (a) it does not have access to the data; (b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.deleted
2021/07/23
Committee: CULT
Amendment 455 #
Proposal for a regulation
Article 31 – paragraph 6 – point a
(a) it does not have access to the data;deleted
2021/07/23
Committee: CULT
Amendment 456 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.deleted
2021/07/23
Committee: CULT
Amendment 457 #
Proposal for a regulation
Article 31 – paragraph 7
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.deleted
2021/07/23
Committee: CULT
Amendment 458 #
Proposal for a regulation
Article 31 – paragraph 7 – subparagraph 1
The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.deleted
2021/07/23
Committee: CULT
Amendment 461 #
Proposal for a regulation
Article 33 a (new)
Article 33 a
Accessibility requirements
1. Very large online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882.
2. Very large online platforms shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Intermediary service providers shall keep that information for as long as the service is in operation.
3. Very large online platforms shall ensure that information and measures provided pursuant to Articles 10 (new)(9), 12(1), 13(1), 14(1) and (5), 15(3) and (4), 17(1), (2) and (4), 23(2), 24, 29(1) and (2), 30(1), and 33(1) are made available in a manner that they are easy to find and accessible to persons with disabilities.
4. Very large online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements.
5. In the case of non-conformity, providers of intermediary services shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements and shall immediately inform the Digital Services Coordinator of establishment or other competent national authority of the Member States in which the service is established.
6. Very large online platforms shall cooperate with the competent authority or Digital Services Coordinator, upon a reasoned request, and provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements.
7. Very large online platforms shall be presumed to be in conformity with the accessibility requirements of this Regulation when they are in conformity with harmonised standards or parts thereof the references of which have been published in the Official Journal of the European Union.
8. Very large online platforms which are in conformity with the technical specifications or parts thereof adopted for the Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
9. Very large online platforms shall, at least once a year, report to Digital Service Coordinators or other competent authorities on their obligation to ensure accessibility for persons with disabilities as required by this Regulation.
10. In addition to the information included in Article 44(2), activity reports by the Digital Services Coordinators shall include measures taken pursuant to Article 10 (new).
2021/07/23
Committee: CULT
Amendment 462 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board may facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and adverse impacts, in accordance with Union law, in particular on competition and the protection of personal data.
2021/07/23
Committee: CULT
Amendment 464 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant adverse impacts within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/23
Committee: CULT
Amendment 465 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.deleted
2021/07/23
Committee: CULT
Amendment 466 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board may assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and may regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
2021/07/23
Committee: CULT
Amendment 467 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board may regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
2021/07/23
Committee: CULT
Amendment 468 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission may facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
2021/07/23
Committee: CULT
Amendment 469 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.deleted
2021/07/23
Committee: CULT
Amendment 470 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
2021/07/23
Committee: CULT
Amendment 471 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
2. The Commission may encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
2021/07/23
Committee: CULT
Amendment 472 #
Proposal for a regulation
Article 37 – paragraph 2 – point a
(a) displaying prominent information on the crisis situation provided by Member States’ authorities or at Union level, which is also accessible for persons with disabilities;
2021/07/23
Committee: CULT
Amendment 473 #
Proposal for a regulation
Article 37 – paragraph 3
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
2021/07/23
Committee: CULT
Amendment 474 #
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
(f a) measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing accessible description about these protocols;
2021/07/23
Committee: CULT
Amendment 475 #
Proposal for a regulation
Article 37 a (new)
Article 37 a
Accountability and transparency
1. Before initiating or facilitating the negotiation or the revision of codes of conduct, the Commission shall:
(a) consider the appropriateness of proposing legislation;
(b) publish the elements of the code which it could propose or advocate;
(c) invite the European Parliament, the Council, the Fundamental Rights Agency, the public and, where relevant, the European Data Protection Supervisor to express their opinion and publish their opinions;
(d) conduct a Fundamental Rights Impact Assessment and publish the findings.
2. The Commission shall subsequently publish the elements of the envisaged code, which it intends to propose or advocate in the negotiations. It shall not propose or advocate elements which the European Parliament or the Council object to or which have not been subject to the process set out in paragraph 1.
3. The Commission shall allow representatives of non-governmental organisations, which advocate the interests of the recipients of relevant services, the European Parliament, the Council and the Fundamental Rights Agency to observe the negotiations and to have access to all documents pertaining to them. The Commission shall offer compensation to non-profit participants.
4. The Commission shall publish codes of conduct and their parties and keep the information updated.
5. This Article shall apply, mutatis mutandis, to crisis protocols.
2021/07/23
Committee: CULT
Amendment 498 #
Proposal for a regulation
Recital 62 a (new)
(62a) Recommender systems used by very large online platforms pose a particular risk in terms of consumer choice and lock-in effects. Consequently, in addition to the obligations applicable to all online platforms, very large online platforms should offer to the recipients of the service the choice of using recommender systems from third party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the platform of its own recommender systems, including through application programming interfaces.
2021/07/08
Committee: IMCO
Amendment 581 #
Proposal for a regulation
Article 13 a (new)
Article 13a
Targeting of digital advertising
1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of displaying digital advertising to a specific recipient or group of recipients.
2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language or the approximate geographical location of the recipient of the service to whom the advertisement is displayed.
3. The use of the contextual information referred to in paragraph 2 shall only be permissible if the advertisement is displayed in real time and it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or group of persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or group of persons.
2021/07/19
Committee: JURI
Amendment 783 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.deleted
2021/07/08
Committee: IMCO
Amendment 797 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
No provision of this Regulation shall prevent providers of intermediary services from offering end-to-end encrypted services, or make the provision of such services a cause for liability or loss of immunity.
2021/07/08
Committee: IMCO
Amendment 895 #
Proposal for a regulation
Article 9 a (new)
Article 9a
Exclusion for micro enterprises and not-for-profit services
This Chapter shall not apply to online platforms that qualify as micro enterprises within the meaning of the Annex to Recommendation 2003/361/EC or as a not-for-profit service with fewer than 100,000 monthly active users.
2021/07/08
Committee: IMCO
Amendment 936 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall publish summary versions of their terms and conditions in clear, user-friendly and unambiguous language, and in an easily accessible and machine-readable format. Such a summary shall include information on remedies and redress mechanisms pursuant to Articles 17 and 18, where available.
2021/07/08
Committee: IMCO
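
The machine-readable summary required by Amendment 936 above is not tied to any particular format. Purely as an illustration, a provider could publish something like the following JSON document; every key and value here is invented:

import json

summary = {
    "service": "example-platform",
    "summary_language": "en",
    "restrictions": ["no illegal content", "no spam", "no impersonation"],
    "remedies": {
        "internal_complaints": "https://example.invalid/appeals",      # Article 17
        "out_of_court_settlement": "certified body under Article 18",  # where available
    },
    "last_updated": "2021-07-08",
}
print(json.dumps(summary, indent=2))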
Amendment 955 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Any restriction referred to in paragraph 1 must respect the fundamental rights enshrined in the Charter and relevant national law.
2021/07/08
Committee: IMCO
Amendment 957 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2b. Individuals who are enforcing restrictions on the basis of terms and conditions of providers of intermediary services shall be given adequate initial and ongoing training on the applicable laws and international human rights standards, as well as on the action to be taken in case of conflict with the terms and conditions. Such individuals shall be provided with appropriate working conditions, including professional support, qualified psychological assistance and qualified legal advice, where relevant.
2021/07/08
Committee: IMCO
Amendment 960 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall notify the recipients of the service of any change to the contract terms and conditions that can affect their rights and provide a user-friendly explanation thereof. The changes shall not be implemented before the expiry of a notice period which is reasonable and proportionate to the nature and extent of the envisaged changes and to their consequences for the recipients of the service. That notice period shall be at least 15 days from the date on which the provider of intermediary services notifies the recipients about the changes. Failure to consent to such changes should not lead to basic services becoming unavailable.
2021/07/08
Committee: IMCO
Amendment 1034 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator may, in some cases, identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements:
2021/07/08
Committee: IMCO
Amendment 1041 #
Proposal for a regulation
Article 14 – paragraph 2 – point a a (new)
(aa) evidence that substantiates the claim, where possible;
2021/07/08
Committee: IMCO
Amendment 1045 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the exact electronic location of that information, such as the URL or URLs or other identifiers where appropriate, and, where necessary, additional information enabling the identification of the alleged illegal content;
2021/07/08
Committee: IMCO
Amendment 1051 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
(c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;deleted
2021/07/08
Committee: IMCO
Amendment 1054 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.deleted
2021/07/08
Committee: IMCO
Amendment 1065 #
Proposal for a regulation
Article 14 – paragraph 4
4. The individual or entity that submitted the notice shall be given the option to provide an electronic mail address to enable the provider of hosting services to promptly send a confirmation of receipt of the notice to that individual or entity.
2021/07/08
Committee: IMCO
Amendment 1085 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
2021/07/08
Committee: IMCO
Amendment 1086 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6b. Information that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality is still pending. Member States shall ensure that providers of intermediary services are not held liable for failure to remove notified information, while the assessment of legality is still pending.
2021/07/08
Committee: IMCO
Amendment 1090 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6c. A decision taken pursuant to a notice submitted in accordance with Article 14(1) shall protect the rights and legitimate interests of all affected parties, in particular their fundamental rights as enshrined in the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue.
2021/07/08
Committee: IMCO
Amendment 1091 #
Proposal for a regulation
Article 14 – paragraph 6 d (new)
6d. The provider of hosting services shall ensure that processing of notices is undertaken by qualified individuals to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
2021/07/08
Committee: IMCO
Amendment 1128 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Content moderation
1. Providers of hosting services shall not use ex-ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are provided to staff, and that, where necessary, they are given the opportunity to seek professional support, qualified psychological assistance and qualified legal advice. This paragraph shall not apply to moderating information which has most likely been provided by automated tools.
2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
2021/07/08
Committee: IMCO
Amendment 1153 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, demote, disable access to or impose other sanctions against the information;
2021/07/08
Committee: IMCO
Amendment 1175 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, including for persons with disabilities, user-friendly, non-discriminatory and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint handling system in their terms and conditions in a clear, user-friendly and easily accessible manner, including for persons with disabilities.
2021/07/08
Committee: IMCO
Amendment 1180 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, non-discriminatory and non-arbitrary manner and within seven days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/07/08
Committee: IMCO
Amendment 1190 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means and are reviewed by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
2021/07/08
Committee: IMCO
Amendment 1215 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms, any third party involved in the dispute and recipients of the service provided by the online platforms;
2021/07/08
Committee: IMCO
Amendment 1328 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms, internal complaints-handling systems and out-of-court dispute settlement bodies referred to in Articles 14, 17 and 18, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints or initiate dispute settlements that are manifestly unfounded.
2021/07/08
Committee: IMCO
Amendment 1337 #
Proposal for a regulation
Article 20 – paragraph 3 – point c
(c) the gravity of the misuse and its consequences, in particular on the exercise of fundamental rights, regardless of the absolute numbers or relative proportion;
2021/07/08
Committee: IMCO
Amendment 1342 #
Proposal for a regulation
Article 20 – paragraph 3 – point d a (new)
(da) the fact that notices and complaints were submitted following the use of an automated content recognition system;
2021/07/08
Committee: IMCO
Amendment 1343 #
Proposal for a regulation
Article 20 – paragraph 3 – point d b (new)
(db) any justification provided by the recipient of the service that provides sufficient grounds to consider that the information is not manifestly illegal.
2021/07/08
Committee: IMCO
Amendment 1347 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, with due regard to their obligations under Article 12(2), in particular as regards the applicable fundamental rights of the recipients of the service as enshrined in the Charter, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
2021/07/08
Committee: IMCO
Amendment 1518 #
Proposal for a regulation
Article 24 a (new)
Article 24a
Recommender systems
1. Online platforms that use recommender systems or any other system used to select and determine the order of presentation of content shall set out in their terms and conditions, in a clear, accessible and easily comprehensible format, the parameters used in their recommender systems, as well as the options provided to the recipients of the service to select or modify those parameters.
2. The parameters referred to in paragraph 1 shall include at least the following information:
(a) the criteria and logic used by the recommender systems, including input data and performance metrics;
(b) how these criteria are weighted against each other;
(c) the optimisation goal of the recommender systems;
(d) an explanation of how the behaviour of the recipients of the service may impact the functioning and outputs of the recommender systems.
3. Online platforms shall provide options for the recipients of the service to access their profile to select and modify the parameters of the relevant recommender system, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679 and which is activated by default.
2021/07/08
Committee: IMCO
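
Amendment 1518 above combines a disclosure duty (paragraph 2, points (a) to (d)) with a default non-profiling option (paragraph 3). A compact sketch of both, with every parameter name and value hypothetical:

# Hypothetical disclosure mirroring points (a) to (d) of paragraph 2.
PARAMETERS = {
    "criteria": ["recency", "topic match"],                 # (a) criteria and logic
    "weights": {"recency": 0.7, "topic match": 0.3},        # (b) relative weighting
    "optimisation_goal": "relevance to followed topics",    # (c)
    "behaviour_effects": "likes raise a topic's weight in personalised mode",  # (d)
}

OPTIONS = {
    "chronological": {"uses_profiling": False},  # non-profiling option
    "personalised": {"uses_profiling": True},
}
DEFAULT_OPTION = "chronological"  # paragraph 3: activated by default

def active_option(user_choice=None):
    # Profiling-based ranking applies only on an explicit choice by the
    # recipient; otherwise the non-profiling default stays active.
    return user_choice if user_choice in OPTIONS else DEFAULT_OPTION

print(active_option())                # chronological
print(active_option("personalised"))  # personalised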
Amendment 1544 #
Proposal for a regulation
Article 26 – title
Impact assessment
2021/07/08
Committee: IMCO
Amendment 1553 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, the impact of the functioning and use made of their services in the Union on fundamental rights, including Article 38 of the Charter of Fundamental Rights of the European Union and on ensuring a high level of consumer protection. This impact assessment shall be specific to their services and shall include the following adverse impacts:
2021/07/08
Committee: IMCO
Amendment 1558 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination of manifestly illegal content through their services;
2021/07/08
Committee: IMCO
Amendment 1571 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of fundamental rights, including Article 38 of the Charter of Fundamental Rights of the European Union and in particular the rights to respect for private and family life, freedom of expression and information, freedom of the press, the prohibition of discrimination and the rights of the child, as enshrined in the Charter;
2021/07/08
Committee: IMCO
Amendment 1575 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of fundamental rights as foreseen by the Charter of Fundamental Rights of the European Union, including Article 38 on ensuring a high level of consumer protection.
2021/07/08
Committee: IMCO
Amendment 1591 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting impact assessments, very large online platforms shall take into account, in particular, the effects of their content moderation systems, recommender systems and systems for selecting and displaying advertisements on any of the adverse impacts referred to in paragraph 1, including the potentially rapid and wide dissemination of manifestly illegal content and of information that is incompatible with their terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1594 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. The outcome of the impact assessment and supporting documents shall be communicated to the Board of Digital Services Coordinators and to the Digital Services Coordinator of establishment. A summary version of the impact assessment shall be made publicly available in an easily accessible format.
2021/07/08
Committee: IMCO
Amendment 1598 #
Proposal for a regulation
Article 27 – title
Mitigation of adverse impacts
2021/07/08
Committee: IMCO
Amendment 1600 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific adverse impacts identified pursuant to Article 26, where mitigation is possible without adversely impacting other fundamental rights. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1621 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Articles 35 and 37 respectively.
deleted
2021/07/08
Committee: IMCO
Amendment 1629 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. The decision as to the choice of measures shall remain with the platform.
2021/07/08
Committee: IMCO
Amendment 1635 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent adverse impacts reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33;
2021/07/08
Committee: IMCO
Amendment 1640 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the adverse impacts identified.
2021/07/08
Committee: IMCO
Amendment 1645 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific impacts, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those recommendations, the Commission shall organise public consultations.
2021/07/08
Committee: IMCO
Amendment 1652 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the obligations set out in Chapter III.
2021/07/08
Committee: IMCO
Amendment 1655 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III;
deleted
2021/07/08
Committee: IMCO
Amendment 1660 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
deleted
2021/07/08
Committee: IMCO
Amendment 1663 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits pursuant to paragraph 1 shall be performed by the European Union Agency for Fundamental Rights. The Agency may decide to perform the audit in collaboration with organisations which:
2021/07/08
Committee: IMCO
Amendment 1681 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
2021/07/08
Committee: IMCO
Amendment 1703 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. In addition to the obligations applicable to all online platforms, very large online platforms shall offer the recipients of the service the choice of using recommender systems from third-party providers, where available. Such third parties must be offered access to the same operating system, hardware or software features that are available to, or used in, the provision of the platform's own recommender systems.
2021/07/08
Committee: IMCO
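
A minimal sketch of how the choice of third-party recommender systems in paragraph 2a above might surface at the interface level, assuming the platform exposes a plug-in point for external providers. The type and function names below are invented for illustration and are not drawn from the amendment.

    // A content item as the platform's own recommender would receive it.
    interface ContentItem {
      id: string;
      authorId: string;
      postedAt: string; // ISO 8601 timestamp
    }

    // Contract a third-party recommender would implement; paragraph 2a requires
    // that such providers get the same inputs the platform's own system uses.
    interface RecommenderProvider {
      name: string;
      rank(recipientId: string, candidates: ContentItem[]): Promise<ContentItem[]>;
    }

    // Resolve the provider a recipient selected in their profile settings.
    function selectProvider(providers: RecommenderProvider[], choice: string): RecommenderProvider {
      const provider = providers.find((p) => p.name === choice);
      if (provider === undefined) {
        throw new Error(`Unknown recommender provider: ${choice}`);
      }
      return provider;
    }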
Amendment 1705 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2b. Very large online platforms may only limit access to third-party recommender systems temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive (EU) 2020/0359 or Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to the affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides, by majority vote, that they are unnecessary or disproportionate.
2021/07/08
Committee: IMCO
Amendment 1706 #
Proposal for a regulation
Article 29 – paragraph 2 c (new)
2c. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
2021/07/08
Committee: IMCO
Amendment 1806 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Interoperability
1. Very large online platforms shall make the core functionalities of their services interoperable to enable cross-platform exchange of information with third parties. Very large online platforms shall publicly document all application programming interfaces they make available to that end.
2. Very large online platforms may only limit access to their core functionalities temporarily and in exceptional circumstances, when justified by an obligation under Article 18 of Directive [XX] on measures for a high common level of cybersecurity across the Union, repealing Directive (EU) 2016/1148, or Article 32(1)(c) of Regulation (EU) 2016/679. Such limitations shall be notified within 24 hours to the affected third parties and to the Agency. The Agency may require such limitations to be removed or modified where it decides, by majority vote, that they are unnecessary or disproportionate.
3. Very large online platforms shall not make commercial use of any of the data that is generated or received from third parties as a result of interoperability activities for purposes other than enabling those activities. Any processing of personal data related to those activities shall comply with Regulation (EU) 2016/679, in particular Articles 6(1)(a) and 5(1)(c).
4. The Commission shall adopt implementing measures specifying the nature and scope of the obligations set out in paragraph 1, including open standards and protocols such as application programming interfaces.
2021/07/08
Committee: IMCO
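
To give a concrete, hedged reading of Article 33a(1) above, a publicly documented cross-platform exchange API might look roughly as follows. The endpoint path and field shapes are assumptions; the amendment deliberately leaves concrete standards and protocols to the implementing measures under paragraph 4.

    // Hypothetical message shape exchanged across platforms; field names are
    // illustrative, not mandated by the amendment.
    interface CrossPlatformMessage {
      id: string;
      author: string;      // e.g. an account URI on the originating platform
      content: string;
      publishedAt: string; // ISO 8601 timestamp
    }

    // Fetch public posts from another platform through its documented API.
    // The /api/accounts/.../posts path is an assumed example endpoint.
    async function fetchRemotePosts(baseUrl: string, account: string): Promise<CrossPlatformMessage[]> {
      const response = await fetch(`${baseUrl}/api/accounts/${encodeURIComponent(account)}/posts`);
      if (!response.ok) {
        throw new Error(`Interoperability request failed with status ${response.status}`);
      }
      return (await response.json()) as CrossPlatformMessage[];
    }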
Amendment 1850 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board may facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and adverse impacts, in accordance with Union law, in particular on competition and the protection of personal data.
2021/07/08
Committee: IMCO
Amendment 1857 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant adverse impacts within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/08
Committee: IMCO
Amendment 1861 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
deleted
2021/07/08
Committee: IMCO
Amendment 1871 #
Proposal for a regulation
Article 35 – paragraph 4
4. The Commission and the Board may assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and may regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
2021/07/08
Committee: IMCO
Amendment 1876 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board may regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
2021/07/08
Committee: IMCO
Amendment 1884 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission may encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service, and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
2021/07/08
Committee: IMCO
Amendment 1898 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
2021/07/08
Committee: IMCO
Amendment 1899 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
2. The Commission may encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
2021/07/08
Committee: IMCO
Amendment 1901 #
Proposal for a regulation
Article 37 – paragraph 3
3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
2021/07/08
Committee: IMCO
Amendment 1905 #
Proposal for a regulation
Article 37 a (new)
Article 37a
Accountability and transparency
1. Before initiating or facilitating the negotiation or the revision of codes of conduct, the Commission shall:
(a) consider the appropriateness of proposing legislation;
(b) publish the elements of the code which it could propose or advocate;
(c) invite the European Parliament, the Council, the Fundamental Rights Agency, the public and, where relevant, the European Data Protection Supervisor to express their opinions, and publish those opinions;
(d) conduct a Fundamental Rights Impact Assessment and publish the findings.
2. The Commission shall subsequently publish the elements of the envisaged code which it intends to propose or advocate in the negotiations. It shall not propose or advocate elements to which the European Parliament or the Council object or which have not been subject to the process set out in paragraph 1.
3. The Commission shall allow representatives of non-governmental organisations which advocate the interests of the recipients of relevant services, the European Parliament, the Council and the Fundamental Rights Agency to observe the negotiations and to have access to all documents pertaining to them. The Commission shall offer compensation to non-profit participants.
4. The Commission shall publish the codes of conduct and their parties and keep that information updated.
5. This Article shall apply, mutatis mutandis, to crisis protocols.
2021/07/08
Committee: IMCO