
44 Amendments of Maite PAGAZAURTUNDÚA related to 2020/0361(COD)

Amendment 144 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/06/10
Committee: LIBE
Amendment 146 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admission to a user group, such information should only be considered to be publicly available when users seeking to access such information are automatically registered or admitted without human intervention to decide or select the users to whom access is granted. The mere possibility to create groups of users of a given service, including a messaging service, should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/06/10
Committee: LIBE
Amendment 159 #
Proposal for a regulation
Recital 24
(24) The exemptions from liability established in this Regulation should not affect the possibility of injunctions of different kinds against providers of intermediary services, even where they meet the conditions set out as part of those exemptions. Such injunctions could, in particular, consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal content specified in such orders, issued in compliance with Union law, or the disabling of access to it. Such orders, in particular where they require the provider to prevent illegal content from reappearing, must be issued in compliance with Union law, in particular with the prohibition of general monitoring obligations, as interpreted by the Court of Justice of the European Union.
2021/06/10
Committee: LIBE
Amendment 167 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content. Member States should however have the possibility to require service providers who host information provided by users of their service to apply a diligent duty of care.
2021/06/10
Committee: LIBE
Amendment 171 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements to ensure the effective processing of those orders.
2021/06/10
Committee: LIBE
Amendment 187 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent the reappearance of the notified illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/06/10
Committee: LIBE
Amendment 188 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means that have been proven to be efficient, proportionate and accurate, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/06/10
Committee: LIBE
Amendment 193 #
Proposal for a regulation
Recital 4 a (new)
(4a) As the Union is a Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the provisions of the Convention are an integral part of the Union legal order and binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and to other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity”.1a Given the ever-growing importance of digital services and platforms in private and public life, in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects the rights of all recipients of services, including persons with disabilities. Declaration 22 annexed to the Final Act of Amsterdam provides that the institutions of the Union are to take account of the needs of persons with disabilities in drawing up measures under Article 114 TFEU. __________________ 1a General comment No. 2 (2014) on Article 9: Accessibility, of the UN Convention on the Rights of Persons with Disabilities.
2021/07/08
Committee: IMCO
Amendment 197 #
Proposal for a regulation
Recital 5 a (new)
(5a) Given the cross-border nature of the services at stake, Union action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and to ensure that the equal right of all consumers and other recipients of services, including persons with disabilities, to access and choose those services is protected throughout the Union. A lack of harmonised accessibility requirements for digital services and platforms will also create barriers to the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act1a and the Web Accessibility Directive1b, so that no one is left behind as a result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals. __________________ 1a Directive (EU) 2019/882 of the European Parliament and of the Council of 17 April 2019 on the accessibility requirements for products and services 1b Directive (EU) 2016/2102 of the European Parliament and of the Council of 26 October 2016 on the accessibility of the websites and mobile applications of public sector bodies
2021/07/08
Committee: IMCO
Amendment 198 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content for instance, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’), or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/06/10
Committee: LIBE
Amendment 201 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and reliable safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress against the decisions taken in this regard by online platforms should always be available, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/06/10
Committee: LIBE
Amendment 203 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justifies, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, notably when it concerns vulnerable users such as children, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/06/10
Committee: LIBE
Amendment 209 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain the consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/06/10
Committee: LIBE
Amendment 210 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, specifically regarding disinformation, misinformation, hate speech or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/06/10
Committee: LIBE
Amendment 213 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause real societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/06/10
Committee: LIBE
Amendment 216 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for the possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the way platforms’ terms and conditions, including content moderation policies, are enforced, including through automatic means. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, fundamental rights, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/06/10
Committee: LIBE
Amendment 223 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions, as well as making their content moderation policies, and the way they are enforced, fully transparent for users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 229 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate, efficient and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
2021/06/10
Committee: LIBE
Amendment 241 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms, such as the dissemination of illegal and harmful content, brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 246 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation, and encourage online platforms to follow those codes. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/06/10
Committee: LIBE
Amendment 249 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another aspect which needs to be considered is the possible negative impacts of systemic risks on society and democracy, such as disinformation, harmful content, in particular hate speech, or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/06/10
Committee: LIBE
Amendment 251 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech, as well as the Code of practice on disinformation. This applies in particular to the latter, since the Commission issued guidance for strengthening the Code of practice on disinformation, as announced in the European Democracy Action Plan, in May 2021.
2021/06/10
Committee: LIBE
Amendment 253 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment in the public interest. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
2021/06/10
Committee: LIBE
Amendment 367 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for specific reasons, such as reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/06/10
Committee: LIBE
Amendment 445 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned and shall create an obligation on the part of the notified provider of hosting services to remove or disable access to the notified information expeditiously.
2021/06/10
Committee: LIBE
Amendment 455 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where decisions to remove or disable information are taken, they shall extend to preventing the reappearance of that information. Where providers of hosting services use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/06/10
Committee: LIBE
Amendment 557 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, unless that manifestly illegal content was due to wrongful notices and complaints as described in paragraph 2 of this Article.
2021/06/10
Committee: LIBE
Amendment 578 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall remove or disable the content and promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2021/06/10
Committee: LIBE
Amendment 591 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing clearly between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
2021/06/10
Committee: LIBE
Amendment 610 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/08
Committee: IMCO
Amendment 628 #
Proposal for a regulation
Article 1 – paragraph 5 – point b a (new)
(ba) Audiovisual Media Services Directive (EU) 2018/1808.
2021/07/08
Committee: IMCO
Amendment 630 #
Proposal for a regulation
Article 1 – paragraph 5 – point b b (new)
(bb) Directive (EU) 2019/882 (European Accessibility Act).
2021/07/08
Committee: IMCO
Amendment 632 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how and whether their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/06/10
Committee: LIBE
Amendment 640 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/06/10
Committee: LIBE
Amendment 739 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1), which is in the public interest.
2021/06/10
Committee: LIBE
Amendment 740 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘persons with disabilities’ means persons within the meaning of Article 3(1) of Directive (EU) 2019/882.
2021/07/08
Committee: IMCO
Amendment 785 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall supervise the monitoring of the application of those codes two years after the application of this Regulation.
2021/06/10
Committee: LIBE
Amendment 892 #
Proposal for a regulation
Chapter III – title
Due diligence obligations for a transparent, accessible and safe online environment
2021/07/08
Committee: IMCO
Amendment 906 #
Proposal for a regulation
Article 10 a (new)
Article 10a
Accessibility requirements for intermediary services
1. Providers of intermediary services which offer services in the Union shall ensure that they design and provide those services in accordance with the accessibility requirements set out in Sections III, IV, VI and VII of Annex I to Directive (EU) 2019/882.
2. Providers of intermediary services shall prepare the necessary information in accordance with Annex V to Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Providers of intermediary services shall keep that information for as long as the service is in operation.
3. Providers of intermediary services shall ensure that the information, forms and measures provided pursuant to Articles 10 new (9), 12(1), 13(1), 14(1) and (5), 15(3) and (4), 17(1), (2) and (4), 23(2), 24, 29(1) and (2), 30(1), and 33(1) are made available in a manner that is easy to find, accessible to persons with disabilities, and does not exceed a level of complexity superior to level B1 (intermediate) of the Council of Europe’s Common European Framework of Reference for Languages.
4. Providers of intermediary services which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in the applicable accessibility requirements and changes in the harmonised standards or in the technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services.
5. In the case of non-conformity, providers of intermediary services shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements. Furthermore, where the service is not compliant with the applicable accessibility requirements, the provider of the intermediary service shall immediately inform the Digital Services Coordinator of establishment or other competent national authority of the Member State in which the service is established to that effect, giving details, in particular, of the non-compliance and of any corrective measures taken.
6. Providers of intermediary services shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at its request, on any action taken to bring the service into compliance with those requirements.
7. Intermediary services which are in conformity with harmonised standards or parts thereof, the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation insofar as those standards or parts thereof cover those requirements.
8. Intermediary services which are in conformity with the technical specifications or parts thereof adopted for Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation insofar as those technical specifications or parts thereof cover those requirements.
9. All intermediary services shall, at least once a year, report to their respective Digital Services Coordinators or other competent authorities on their progress in implementing the obligation to ensure accessibility for persons with disabilities as required by this Regulation. In addition to the information referred to in Article 44(2), Digital Services Coordinators shall include in their reports the measures taken pursuant to this Article.
2021/07/08
Committee: IMCO
Amendment 1223 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/08
Committee: IMCO
Amendment 1233 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it is capable of settling disputes in a swift, efficient, accessible for persons with disabilities, and cost-effective manner and in at least one official language of the Union;
2021/07/08
Committee: IMCO
Amendment 1825 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) accessibility of elements and functions of online platforms and digital services for persons with disabilities aiming at consistency and coherence with existing harmonised accessibility requirements when these elements and functions are not already covered by existing harmonised European standards.
2021/07/08
Committee: IMCO
Amendment 1900 #
Proposal for a regulation
Article 37 – paragraph 2 – point a
(a) displaying prominent information on the crisis situation provided by Member States’ authorities or at Union level, which is accessible to persons with disabilities;
2021/07/08
Committee: IMCO
Amendment 1902 #
Proposal for a regulation
Article 37 – paragraph 4 – point f a (new)
(fa) measures to ensure accessibility for persons with disabilities during the implementation of crisis protocols, including by providing an accessible description of those protocols.
2021/07/08
Committee: IMCO