
Activities of Anna Júlia DONÁTH related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (81)

Amendment 144 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/06/10
Committee: LIBE
Amendment 162 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken in order to detect, identify and act on illegal pieces of content on a voluntary basis, should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/06/10
Committee: LIBE
Amendment 201 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate, proportionate and reliable safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/06/10
Committee: LIBE
Amendment 203 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, notably when it concerns vulnerable users such as children, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
_________________
44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/06/10
Committee: LIBE
Amendment 209 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have easy access to information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/06/10
Committee: LIBE
Amendment 210 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, specifically regarding disinformation, misinformation, hate speech or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/06/10
Committee: LIBE
Amendment 223 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions, as well as making content moderation policies, and the way they are enforced, fully transparent for the users. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 249 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another aspect which needs to be considered is the possible negative impacts of systemic risks on society and democracy, such as disinformation, harmful content, in particular hate speech, or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/06/10
Committee: LIBE
Amendment 254 #
Proposal for a regulation
Recital 75
(75) Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to apply and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. They should however refrain from designating the same authorities as those designated pursuant to Article 30 of the Audiovisual Media Services Directive, in order to avoid providing one single institution with the authority to shape the Member State’s entire media landscape and online space. The measures to that effect may include, inter alia, the preclusion to dismiss the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members.
2021/06/10
Committee: LIBE
Amendment 288 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘very large online platform’ means a provider of a hosting service which provides its services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3;
2021/06/10
Committee: LIBE
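Amendments 288 and 702 reduce VLOP status to a single numeric test. A minimal sketch of that test follows; the plain twelve-month mean and all names are assumptions, since the amendment defers the actual counting methodology to delegated acts.

```python
# Hypothetical sketch of the VLOP threshold in Amendment 288.
# The averaging methodology is left to delegated acts; a plain mean
# over the observed months is an assumption, not the legal method.

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the Union

def is_very_large_online_platform(monthly_active_recipients: list[int]) -> bool:
    """True if the average over the observed months meets the threshold."""
    if not monthly_active_recipients:
        return False
    average = sum(monthly_active_recipients) / len(monthly_active_recipients)
    return average >= VLOP_THRESHOLD

# Example: twelve months just above the threshold.
print(is_very_large_online_platform([46_000_000] * 12))  # True
```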
Amendment 311 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
2021/06/10
Committee: LIBE
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empowerprovide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 401 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, timely, proportionate and non-discriminatory manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved. The fundamental rights of the recipients of the service as enshrined in the Charter shall be applied in particular when limitations are imposed.
2021/06/10
Committee: LIBE
Amendment 406 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Terms and conditions of providers of intermediary services shall respect the essential principles of fundamental rights as enshrined in the Charter.
2021/06/10
Committee: LIBE
Amendment 409 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2 b. Terms and conditions that do not comply with this Article and with the Charter of Fundamental Rights shall be considered invalid.
2021/06/10
Committee: LIBE
Amendment 411 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2 c. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting-out from optional clauses and the remedies available.
2021/06/10
Committee: LIBE
Amendment 416 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish in an easily accessible manner, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. The reports must be searchable and archived for further use. Those reports shall include, in particular, information on the following, as applicable:
2021/06/10
Committee: LIBE
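Under this wording a transparency report is, in effect, a dated, searchable, archivable record of moderation activity. A sketch of one possible record shape; the field names are illustrative, as the amendment prescribes the content, not a schema.

```python
# Illustrative shape for an Article 13 transparency report entry.
# Field names are assumptions; the text requires clear, detailed,
# searchable reports that remain archived for further use.
from dataclasses import dataclass
from datetime import date

@dataclass
class ModerationReport:
    period_start: date
    period_end: date
    notices_received: int
    items_removed: int
    automated_decisions: int
    human_reviewed_decisions: int
    published_url: str = ""   # published in an easily accessible manner
    archived: bool = True     # archived for further use

report = ModerationReport(date(2021, 1, 1), date(2021, 12, 31),
                          notices_received=1200, items_removed=300,
                          automated_decisions=250,
                          human_reviewed_decisions=50)
```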
Amendment 458 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, objective and non-discriminatory manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/06/10
Committee: LIBE
Amendment 462 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. If the recipient of the service notifies the hosting service provider of their disagreement with a decision taken by automated means, the provider must ensure human review of the decision-making process before any action is taken.
2021/06/10
Committee: LIBE
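Read operationally, the new paragraph makes human review a precondition for acting on a contested automated decision. A hedged sketch of that gate, with all names hypothetical:

```python
# Sketch of the human-review gate in Amendment 462: if the recipient
# contests a decision taken by automated means, no action may be taken
# until a human has reviewed it. Names are illustrative only.

def may_take_action(decision_automated: bool,
                    recipient_disagrees: bool,
                    human_reviewed: bool) -> bool:
    if decision_automated and recipient_disagrees:
        return human_reviewed  # blocked until a human review occurs
    return True

assert may_take_action(True, True, False) is False  # contested, unreviewed
assert may_take_action(True, True, True) is True    # reviewed: may proceed
assert may_take_action(False, True, False) is True  # not automated: no gate
```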
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 503 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner within 3 days after submission. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/06/10
Committee: LIBE
Amendment 512 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. Out-of-court dispute settlement shall be carried out within 30 days after submission.
2021/06/10
Committee: LIBE
Amendment 516 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
2021/06/10
Committee: LIBE
Amendment 529 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6 a. This Article is without prejudice to the provisions laid down in Article 43 concerning the ability of recipients of the service to file complaints with the Digital Services Coordinator of their country of residence or, in the case of very large online platforms, with the Commission.
2021/06/10
Committee: LIBE
Amendment 532 #
Proposal for a regulation
Article 18 a (new)
Article 18 a
Burden of proof
The burden of proof as to whether information constitutes legal or illegal content shall be shifted back to the providers of hosting services.
2021/06/10
Committee: LIBE
Amendment 546 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(c a) it is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
2021/06/10
Committee: LIBE
Amendment 597 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking;
2021/06/10
Committee: LIBE
Amendment 598 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
2021/06/10
Committee: LIBE
Amendment 603 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed;
2021/06/10
Committee: LIBE
Amendment 605 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(c a) whether the advertisement was displayed using an automated tool and the identity of the person responsible for that tool;
2021/06/10
Committee: LIBE
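Taken together, points (a) to (c a) of Article 24(1) as amended enumerate the per-advertisement facts a platform would have to surface. One possible record collecting them, purely as an illustration:

```python
# Illustrative record for the Article 24(1) disclosure items as amended:
# (a) marked as an online advertisement, (b) sponsor and financier,
# (c) main targeting parameters, (c a) automated tool and the person
# responsible for it. The class and field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdDisclosure:
    is_marked_as_ad: bool                 # (a) prominent, harmonised marking
    displayed_on_behalf_of: str           # (b) sponsor
    financed_by: str                      # (b) financier
    main_targeting_parameters: list[str]  # (c) clear, meaningful, uniform
    automated_tool: Optional[str] = None           # (c a) tool used, if any
    tool_responsible_person: Optional[str] = None  # (c a) responsible person
```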
Amendment 608 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article.
2021/06/10
Committee: LIBE
Amendment 610 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed.
2021/06/10
Committee: LIBE
Amendment 611 #
Proposal for a regulation
Article 24 a (new)
Article 24 a
Recipients’ consent for advertising practices
1. Providers of intermediary services shall, by default, not make the recipients of their services subject to targeted, micro-targeted and behavioural advertising unless the recipient of the service has expressed a freely given, specific, informed and unambiguous consent. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information, including information about the value of giving access to and about the use of their data.
2. When asking for the consent of recipients of their services considered as vulnerable consumers, providers of intermediary services shall implement all the necessary measures to ensure that such consumers have received enough and relevant information before they give their consent.
3. When processing data for targeted, micro-targeted and behavioural advertising, online intermediaries shall comply with relevant Union law and shall not engage in activities that can lead to pervasive tracking, such as disproportionate combination of data collected by platforms, or disproportionate processing of special categories of data that might be used to exploit vulnerabilities.
4. Providers of intermediary services shall organise their online interface in such a way that recipients of services, in particular those considered as vulnerable consumers, can easily and efficiently access and modify advertising parameters. Providers of intermediary services shall monitor the use of advertising parameters by recipients of services on a regular basis and make best efforts to improve their awareness about the possibility to modify those parameters.
2021/06/10
Committee: LIBE
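Paragraph 1 of the proposed Article 24 a is a default-deny rule: no targeted, micro-targeted or behavioural advertising without valid consent. A minimal sketch of that check, under the assumption that consent is modelled as four boolean qualities; all names are hypothetical.

```python
# Sketch of the Article 24a(1) default in Amendment 611: targeting is
# off unless consent is freely given, specific, informed, unambiguous.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Consent:
    freely_given: bool
    specific: bool
    informed: bool
    unambiguous: bool

def may_target_ads(consent: Optional[Consent]) -> bool:
    if consent is None:
        return False  # the default: no targeting without consent
    return all((consent.freely_given, consent.specific,
                consent.informed, consent.unambiguous))

assert may_target_ads(None) is False
assert may_target_ads(Consent(True, True, True, True)) is True
```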
Amendment 702 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘very large online platform’ means a provider of a hosting service which provides its services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3;
2021/07/08
Committee: IMCO
Amendment 729 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and any related payments received;
2021/06/10
Committee: LIBE
Amendment 736 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article or to civil society organisations engaged in monitoring the Rule of Law, Fundamental Rights and European values, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1), or for educational purposes.
2021/06/10
Committee: LIBE
Amendment 745 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
3 a. Upon request by the recipient of the service, or at least once a year, very large online platforms shall make available to the recipient of the service comprehensive information about the data concerning the recipient of the service that was used in the previous year. The information shall encompass a listing of the data that was collected, how it was used and with what third parties it was shared. Online platforms shall present this information in a way that makes it easy to understand.
2021/06/10
Committee: LIBE
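The new paragraph 3a amounts to an annual, per-recipient data statement: what was collected, how it was used, and with what third parties it was shared. A sketch of such a statement; the layout and names are assumptions, as the text only requires the content in an easy-to-understand form.

```python
# Illustrative structure for the yearly data statement in Amendment 745.
from dataclasses import dataclass, field

@dataclass
class RecipientDataStatement:
    year: int
    data_collected: list[str] = field(default_factory=list)  # what was collected
    uses: list[str] = field(default_factory=list)            # how it was used
    shared_with: list[str] = field(default_factory=list)     # third parties

statement = RecipientDataStatement(
    year=2021,
    data_collected=["watch history", "search queries"],  # hypothetical items
    uses=["recommendations", "ad targeting"],
    shared_with=["example-ad-partner"],
)
```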
Amendment 749 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1, 2 and 3a and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/06/10
Committee: LIBE
Amendment 773 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
2021/07/08
Committee: IMCO
Amendment 785 #
Proposal for a regulation
Article 36 – paragraph 3
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall supervise the monitoring of the application of those codes two years after the application of this Regulation.
2021/06/10
Committee: LIBE
Amendment 790 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph -1 (new)
-1. Member States shall not designate the regulatory authorities referred to in Article 30 of Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services as competent authorities or as Digital Services Coordinator.
2021/06/10
Committee: LIBE
Amendment 793 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph -1 (new)
-1 When a Member State is subject to a procedure referred to in Article 7(1) or 7(2) of the Treaty on European Union or against which a procedure based on Regulation 2020/2092 has been initiated, the Commission shall additionally confirm that the Digital Services Coordinator proposed by that Member State fulfils the requirements laid down in Article 39 before that Digital Services Coordinator can be designated.
2021/06/10
Committee: LIBE
Amendment 794 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph -1 a (new)
-1 a This paragraph applies mutatis mutandis to the certification process for out-of-court dispute settlement bodies as described in Article 18(2) and the award of the status of trusted flagger as described in Article 19(2).
2021/06/10
Committee: LIBE
Amendment 795 #
Proposal for a regulation
Article 39 – paragraph -1 (new)
-1. Member States shall ensure that the Digital Services Coordinators are legally distinct from the government and functionally independent of their respective governments and of any other public or private body.
2021/06/10
Committee: LIBE
Amendment 806 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established or, in the case of very large online platforms, with the Commission. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment or, in the case of very large online platforms, to the Commission. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
2021/06/10
Committee: LIBE
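The amended Article 43 describes a simple routing rule for complaints: lodged with the Coordinator of the recipient’s Member State, then handed to the Coordinator of establishment, to the Commission for very large online platforms, or to another competent national authority. A sketch of that routing, names illustrative:

```python
# Sketch of the complaint routing in Amendment 806 (Article 43).
# The three outcomes mirror the provision; the flags are assumptions.

def route_complaint(provider_is_vlop: bool,
                    other_authority_competent: bool) -> str:
    if provider_is_vlop:
        return "European Commission"
    if other_authority_competent:
        return "competent national authority"
    return "Digital Services Coordinator of establishment"

print(route_complaint(provider_is_vlop=True,
                      other_authority_competent=False))
# -> European Commission
```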
Amendment 868 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor and audit the effective implementation and compliance with this Regulation and the Charter of Fundamental Rights by the very large online platform concerned, including the operation of any algorithm in the provision of its services. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms.
2021/06/10
Committee: LIBE
Amendment 942 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective, timely, proportionate and non-discriminatory manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved. The fundamental rights of the recipients of the service as enshrined in the Charter shall be applied in particular when limitations are imposed.
2021/07/08
Committee: IMCO
Amendment 962 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2c. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting-out from optional clauses and the remedies available.
2021/07/08
Committee: IMCO
Amendment 974 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish in an easily accessible manner, at least twice a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. The reports must be searchable and archived for further use. Those reports shall include, in particular, information on the following, as applicable:
2021/07/08
Committee: IMCO
Amendment 1076 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, objective and non-discriminatory manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/08
Committee: IMCO
Amendment 1084 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. If the recipient of the service notifies the hosting service provider of their disagreement with a decision taken by automated means, the provider must ensure human review of the decision-making process before any action is taken.
2021/07/08
Committee: IMCO
Amendment 1183 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system diligently, objectively and without undue delay, but no later than 10 days after submission. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/07/08
Committee: IMCO
Amendment 1202 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. Out-of-court dispute settlement shall be carried out within 45 days after submission.
2021/07/08
Committee: IMCO
Amendment 1211 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is impartial and independent of online platforms and recipients of the service provided by the online platforms and is legally distinct from and functionally independent of the government of the Member State or any other public or related private body;
2021/07/08
Committee: IMCO
Amendment 1244 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time and no later than 90 calendar days after the date on which the certified body has received the complaint.
2021/07/08
Committee: IMCO
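The IMCO amendments attach concrete clocks to each stage: 10 days for internal complaints (Amendment 1183), 45 days for out-of-court settlement (Amendment 1202) and 90 calendar days for certified bodies (Amendment 1244). A small deadline helper, as a sketch; treating every count as calendar days is an assumption, since only Amendment 1244 says “calendar”.

```python
# Sketch of the deadlines set by Amendments 1183, 1202 and 1244.
from datetime import date, timedelta

DEADLINES_DAYS = {
    "internal_complaint": 10,         # Article 17(3), Amendment 1183
    "out_of_court_settlement": 45,    # Article 18(1), Amendment 1202
    "certified_body_resolution": 90,  # Article 18(2a), Amendment 1244
}

def deadline(procedure: str, submitted: date) -> date:
    """Latest date by which the given procedure must conclude."""
    return submitted + timedelta(days=DEADLINES_DAYS[procedure])

print(deadline("internal_complaint", date(2022, 1, 19)))  # 2022-01-29
```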
Amendment 1253 #
Proposal for a regulation
Article 18 – paragraph 6 a (new)
6a. This Article is without prejudice to the provisions laid down in Article 43 concerning the ability of recipients of the service to file complaints with the Digital Services Coordinator of their country of residence or, in the case of very large online platforms, with the Commission.
2021/07/08
Committee: IMCO
Amendment 1256 #
Proposal for a regulation
Article 18 a (new)
Article 18a
Burden of proof
The burden of proof as to whether information constitutes legal or illegal content shall be shifted back to the providers of hosting services.
2021/07/08
Committee: IMCO
Amendment 1284 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it is legally distinct from and functionally independent of the government of the Member State or any other public or private body;
2021/07/08
Committee: IMCO
Amendment 1488 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
2021/07/08
Committee: IMCO
Amendment 1496 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed.
2021/07/08
Committee: IMCO
Amendment 1502 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(ca) whether the advertisement was displayed using an automated tool and the identity of the person responsible for that tool;
2021/07/08
Committee: IMCO
Amendment 1511 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
The Commission shall adopt an implementing act establishing harmonised specifications for the marking referred to in paragraph 1(a) of this Article.
2021/07/08
Committee: IMCO
Amendment 1514 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
Providers of intermediary services shall inform the natural or legal person on whose behalf the advertisement is displayed where the advertisement has been displayed.
2021/07/08
Committee: IMCO
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
2021/07/08
Committee: IMCO
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 1722 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and any related payments received;
2021/07/08
Committee: IMCO
Amendment 1755 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article or to civil society organisations engaged in monitoring the Rule of Law, Fundamental Rights and European values, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1), or for educational purposes.
2021/07/08
Committee: IMCO
Amendment 1760 #
Proposal for a regulation
Article 31 – paragraph 3 a (new)
3a. Upon request by the recipient of the service, or at least once a year, very large online platforms shall make available to the recipient of the service comprehensive information about the data concerning the recipient of the service that was used in the previous year. The information shall encompass a listing of the data that was collected, how it was used and with what third parties it was shared. Online platforms shall present this information in a way that makes it easy to understand.
2021/07/08
Committee: IMCO
Amendment 1773 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1, 2 and 3a and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/08
Committee: IMCO
Amendment 1909 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph -1 (new)
-1. Member States shall not designate the regulatory authorities referred to in Article 30 of the Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services as competent authorities or as Digital Services Coordinator.
2021/07/08
Committee: IMCO
Amendment 1910 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1 a (new)
When a Member State is subject to a procedure referred to in Article 7(1) or 7(2) of the Treaty on European Union or against which a procedure based on Regulation 2020/2092 has been initiated, the Commission shall additionally confirm that the Digital Services Coordinator proposed by that Member State fulfils the requirements laid down in Article 39 before that Digital Services Coordinator can be designated.
2021/07/08
Committee: IMCO
Amendment 1915 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2 a (new)
This paragraph applies mutatis mutandis to the certification process for out-of- court dispute settlement bodies as described in Article 18(2) and the award of the status of trusted flagger as described in Article 19(2).
2021/07/08
Committee: IMCO
Amendment 1921 #
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 a (new)
Member States shall ensure that the Digital Services Coordinators are legally distinct from the government and functionally independent of their respective governments and of any other public or private body.
2021/07/08
Committee: IMCO
Amendment 1969 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established or, in the case of very large online platforms, with the Commission. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment or, in the case of very large online platforms, to the Commission. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
2021/07/08
Committee: IMCO
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/08
Committee: IMCO
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO
Amendment 2179 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor and audit the effective implementation and compliance with this Regulation and the Charter of Fundamental Rights by the very large online platform concerned, including the operation of any algorithm in the provision of its services. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms.
2021/07/08
Committee: IMCO