
Activities of Clare DALY related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/09/03
Committee: LIBE
Dossiers: 2020/0361(COD)
Documents: PDF(436 KB) DOC(279 KB)
Authors: Patrick BREYER (MEP ID 197431)

Amendments (197)

Amendment 135 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2021/06/10
Committee: LIBE
Amendment 137 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. That concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial where the illegality of the information or activity results from Union law or from national law that is consistent with Union law and the Charter.
2021/06/10
Committee: LIBE
Amendment 147 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to have been disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision on whom to grant access. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information.
2021/06/10
Committee: LIBE
Amendment 160 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
deleted
2021/06/10
Committee: LIBE
Amendment 192 #
Proposal for a regulation
Recital 4 a (new)
(4a) Online advertisement plays an important role in the online environment, including in relation to the provision of the information society services. However, certain forms of online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to creating financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, to misleading or exploitative marketing or the discriminatory display of advertising with an impact on the equal treatment and the rights of consumers. Consumers are largely unaware of the volume and granularity of the data that is being collected and used to deliver personalised and micro-targeted advertisements, and have little agency and limited ways to stop or control data exploitation. The significant reach of a few online platforms, their access to extensive datasets and participation at multiple levels of the advertising value chain has created challenges for businesses, traditional media services and other market participants seeking to advertise or develop competing advertising services. In addition to the information requirements resulting from Article 6 of Directive 2000/31/EC, stricter rules on targeted advertising and micro-targeting are needed, in favour of less intrusive forms of advertising that do not require extensive tracking of the interaction and behaviour of recipients of the service. Therefore, providers of information society services may only deliver and display online advertising to a recipient or a group of recipients of the service when this is done based on contextual information, such as keywords or metadata. Providers should not deliver and display online advertising to a recipient or a clearly identifiable group of recipients of the service that is based on personal or inferred data relating to the recipients or groups of recipients. 
Where providers deliver and display advertisement, they should be required to ensure that the recipients of the service have certain individualised information necessary for them to understand why and on whose behalf the advertisement is displayed, including sponsored content and paid promotion.
2021/07/08
Committee: IMCO
Amendment 214 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content and activities, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including unsafe and counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through advertising, recommender systems or through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, freedoms and principles as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to protection of personal data, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech, circumventing applicable laws or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of citizens including minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices.
Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/06/10
Committee: LIBE
Amendment 219 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
deleted
2021/06/10
Committee: LIBE
Amendment 227 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
deleted
2021/06/10
Committee: LIBE
Amendment 230 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
deleted
2021/06/10
Committee: LIBE
Amendment 245 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
deleted
2021/06/10
Committee: LIBE
Amendment 248 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
deleted
2021/06/10
Committee: LIBE
Amendment 250 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
deleted
2021/06/10
Committee: LIBE
Amendment 252 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
deleted
2021/06/10
Committee: LIBE
Amendment 256 #
Proposal for a regulation
Recital 79
(79) In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence, and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should in principle take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States.
2021/06/10
Committee: LIBE
Amendment 258 #
Proposal for a regulation
Recital 85
(85) Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than three Member States, should be able to refer the matter to the Commission in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. The Commission, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Board’s intervention under Section 3 of Chapter IV of this Regulation, where the suspected infringer is a very large online platform.
2021/06/10
Committee: LIBE
Amendment 259 #
Proposal for a regulation
Recital 87
(87) In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, the Board to intervene and exercise its investigatory and enforcement powers under this Regulation.
2021/06/10
Committee: LIBE
Amendment 260 #
Proposal for a regulation
Recital 89
(89) The Board should contribute to achieving a common and consistent Union application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis-à-vis very large online platforms. The Board should also contribute to the drafting of relevant templates and codes of conduct and analyse emerging general trends in the development of digital services in the Union.
2021/06/10
Committee: LIBE
Amendment 261 #
Proposal for a regulation
Recital 90
(90) For that purpose, the Board should be able to adopt decisions, opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While not legally binding, the decision to deviate therefrom should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation.
2021/06/10
Committee: LIBE
Amendment 263 #
Proposal for a regulation
Recital 92
(92) The Commission, through the Chair, should participate in the Board without voting rights. Through the Chair, the Commission should ensure that the agenda of the meetings is set in accordance with the requests of the members of the Board as laid down in the rules of procedure and in compliance with the duties of the Board laid down in this Regulation.
2021/06/10
Committee: LIBE
Amendment 264 #
Proposal for a regulation
Recital 96
(96) Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, the Board shall, on its own initiative or upon request, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Board should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Board’s request, or in situations where the Digital Services Coordinator of establishment itself requested for the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform.
2021/06/10
Committee: LIBE
Amendment 265 #
Proposal for a regulation
Recital 97
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Board initiated the proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Board, at its request, in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should be provided with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
2021/06/10
Committee: LIBE
Amendment 266 #
Proposal for a regulation
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Board should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties.
2021/06/10
Committee: LIBE
Amendment 267 #
Proposal for a regulation
Recital 99
(99) In particular, the Board should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Board should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information. In addition, the Board should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Board should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Board should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Board’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
2021/06/10
Committee: LIBE
Amendment 268 #
Proposal for a regulation
Recital 101
(101) The very large online platforms concerned and other persons subject to the exercise of the Board’s powers whose interests may be affected by a decision should be given the opportunity of submitting their observations beforehand, and the decisions taken should be widely publicised. While ensuring the rights of defence of the parties concerned, in particular, the right of access to the file, it is essential that confidential information be protected. Furthermore, while respecting the confidentiality of the information, the Board should ensure that any information relied on for the purpose of its decision is disclosed to an extent that allows the addressee of the decision to understand the facts and considerations that lead up to the decision.
2021/06/10
Committee: LIBE
Amendment 269 #
Proposal for a regulation
Recital 102
(102) In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and on its structure.
deleted
2021/06/10
Committee: LIBE
Amendment 277 #
Proposal for a regulation
Article 1 – paragraph 2 – point b a (new)
(b a) protect minors making use of services falling under this Regulation.
2021/06/10
Committee: LIBE
Amendment 278 #
Proposal for a regulation
Article 1 – paragraph 3
3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.
2021/06/10
Committee: LIBE
Amendment 279 #
Proposal for a regulation
Article 1 – paragraph 4 a (new)
4 a. This Regulation shall not apply to questions relating to information society services covered by Regulation (EU) 2016/679 and Directive 2002/58/EC, including the liability of controllers and processors.
2021/06/10
Committee: LIBE
Amendment 283 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States; or deleted
2021/06/10
Committee: LIBE
Amendment 290 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against direct or indirect remuneration specifically for promoting that information;
2021/06/10
Committee: LIBE
Amendment 293 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank, prioritise or curate in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
2021/06/10
Committee: LIBE
Amendment 300 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.deleted
2021/06/10
Committee: LIBE
Amendment 307 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
2021/06/10
Committee: LIBE
Amendment 314 #
Proposal for a regulation
Article 6
Voluntary own-initiative investigations and legal compliance
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
Article 6 deleted
2021/06/10
Committee: LIBE
Amendment 322 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. No provision of this Regulation shall be understood as mandating, requiring or recommending the use of automated decision-making, or the monitoring of the behaviour of a large number of natural persons.
2021/06/10
Committee: LIBE
Amendment 329 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by a national judicial or administrative authority, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. This rule shall apply mutatis mutandis in respect of competent administrative authorities ordering online platforms to act against traders unlawfully promoting or offering products or services in the Union.
2021/06/10
Committee: LIBE
Amendment 341 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content;
2021/06/10
Committee: LIBE
Amendment 348 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(b a) where the provider has its main establishment, or, if not established in the Union, its legal representative, in a Member State other than that of the authority issuing the order, the territorial scope of the order is limited to the territory of the Member State issuing the order;
2021/06/10
Committee: LIBE
Amendment 350 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(b b) if addressed to a provider that has its main establishment outside the Union, the territorial scope of the order, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
2021/06/10
Committee: LIBE
Amendment 353 #
Proposal for a regulation
Article 8 – paragraph 3
3. The Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67.
2021/06/10
Committee: LIBE
Amendment 361 #
Proposal for a regulation
Article 9 – paragraph 1
1. For the purpose of preventing a serious threat to public security, providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by a national judicial or administrative authority on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/06/10
Committee: LIBE
Amendment 363 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to prevent a serious threat to public security in conformity with applicable Union or national rules;
2021/06/10
Committee: LIBE
Amendment 368 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1 a (new)
- where the information sought constitutes personal data within the meaning of Article 4(1) of Regulation (EU) 2016/679 or Article 3(1) of Directive (EU) 2016/680, a justification that the transfer is in accordance with applicable data protection legislation;
2021/06/10
Committee: LIBE
Amendment 372 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about redress mechanisms available to the provider and to the recipients of the service concerned;
2021/06/10
Committee: LIBE
Amendment 381 #
Proposal for a regulation
Article 9 – paragraph 3
3. The Digital Services Coordinator from the Member State of the national judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the order referred to in paragraph 1 to all Digital Services Coordinators through the system established in accordance with Article 67.
2021/06/10
Committee: LIBE
Amendment 396 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. A summary of the terms and conditions, setting out the most important points in concise, clear and unambiguous language shall also be publicly available.
2021/06/10
Committee: LIBE
Amendment 399 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. The obligation to provide information under paragraph 1 is without prejudice to obligations under Articles 12-14 of Regulation 2016/679.
2021/06/10
Committee: LIBE
Amendment 400 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a fair, transparent, diligent, non-discriminatory, predictable, necessary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, or any other measure in accordance with this Regulation, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
2021/06/10
Committee: LIBE
Amendment 405 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Where a provision contained in the terms and conditions does not comply with Union or Member State law, or is in violation of the Charter, it shall not be binding upon the recipient of the service.
2021/06/10
Committee: LIBE
Amendment 412 #
Proposal for a regulation
Article 2 b (new)
Article 2 b Targeting of digital advertising 1. Providers of information society services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of determining the recipients to whom advertisements are displayed. 2. This provision shall not prevent information society services from determining the recipients to whom advertisements are displayed on the basis of contextual information such as keywords, the language setting communicated by the device of the recipient or the geographical region of the recipients to whom an advertisement is displayed. 3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of one or more natural persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or persons.
2021/07/19
Committee: JURI
Amendment 418 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, the action taken and the average time needed for taking the action specified in those orders;
2021/06/10
Committee: LIBE
Amendment 426 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/06/10
Committee: LIBE
Amendment 430 #
Proposal for a regulation
Article 13 a (new)
Article 13 a Targeting of digital advertising 1. Providers of intermediary services shall not collect or process personal data as defined by Regulation (EU) 2016/679 for the purpose of showing digital advertising. 2. This provision shall not prevent intermediary services from displaying targeted digital advertising based on contextual information such as keywords, the language setting communicated by the device of the recipient or the digital location where the advertisement is displayed. 3. The use of the contextual information referred to in paragraph 2 shall only be permissible if it does not allow for the direct or, by means of combining it with other information, indirect identification of a natural person or a clearly identifiable group of recipients/persons, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
2021/06/10
Committee: LIBE
Amendment 443 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.deleted
2021/06/10
Committee: LIBE
Amendment 454 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for the processing of notices or decision-making, they shall include information on such use in the notification referred to in paragraph 4. This shall include meaningful information about the procedure followed, the technology used and the criteria and reasoning supporting the decision, as well as the logic involved in the automated decision-making.
2021/06/10
Committee: LIBE
Amendment 472 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, and in any case whether the decision was taken in respect of content detected or identified using automated means; this shall include meaningful information about the procedure followed, the technology used and the criteria and reasoning supporting the decision, as well as the logic involved in the automated decision-making;
2021/06/10
Committee: LIBE
Amendment 482 #
Proposal for a regulation
Article 15 a (new)
Article 15 a Content moderation 1. Providers of hosting services shall not use ex ante control measures based on automated tools or upload-filtering of content for content moderation. Where providers of hosting services otherwise use automated tools for content moderation, they shall ensure that qualified staff decide on any action to be taken and that legal content which does not infringe the terms and conditions set out by the providers is not affected. The provider shall ensure that adequate initial and ongoing training on the applicable legislation and international human rights standards, as well as appropriate working conditions, are provided to staff, and that, where necessary, they are given the opportunity to seek professional support, qualified psychological assistance and qualified legal advice. This paragraph shall not apply to moderating information which has most likely been provided by automated tools. 2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service as enshrined in the Charter.
2021/06/10
Committee: LIBE
Amendment 484 #
Proposal for a regulation
Article 16 – title
Exclusion for micro and small enterprises
2021/06/10
Committee: LIBE
Amendment 487 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/06/10
Committee: LIBE
Amendment 494 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable or restrict access to the information;
2021/06/10
Committee: LIBE
Amendment 499 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall publicly disclose the rules of procedure of their internal complaint handling system in their terms and conditions and shall present them to recipients of the service in a clear, user-friendly and easily accessible manner.
2021/06/10
Committee: LIBE
Amendment 510 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means.
2021/06/10
Committee: LIBE
Amendment 517 #
Proposal for a regulation
Article 18 – paragraph 2 – point a
(a) it is impartial and independent of online platforms and individual recipients of the service provided by the online platforms;
2021/06/10
Committee: LIBE
Amendment 518 #
Proposal for a regulation
Article 18 – paragraph 2 – point a a (new)
(a a) it or its representatives are remunerated in a way that is not linked to the outcome of the procedure;
2021/06/10
Committee: LIBE
Amendment 525 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to pay or reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
2021/06/10
Committee: LIBE
Amendment 526 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof. Out-of-court dispute settlement procedures should preferably be free of charge for the consumer. In the event that costs are applied, the procedure should be accessible, attractive and inexpensive for consumers. To that end, costs should not exceed a nominal fee.
2021/06/10
Committee: LIBE
Amendment 551 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/06/10
Committee: LIBE
Amendment 555 #
1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.deleted
2021/06/10
Committee: LIBE
Amendment 567 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
2021/06/10
Committee: LIBE
Amendment 570 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
2021/06/10
Committee: LIBE
Amendment 575 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
2021/06/10
Committee: LIBE
Amendment 579 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of a person is imminent, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2021/06/10
Committee: LIBE
Amendment 583 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol.
2021/06/10
Committee: LIBE
Amendment 590 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
2021/06/10
Committee: LIBE
Amendment 592 #
Proposal for a regulation
Article 23 – paragraph 1 – point c
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied, including human review. This shall include meaningful information about the procedure followed, the technology used and the criteria and reasoning supporting decisions, as well as the logic involved in the automated decision-making.
2021/06/10
Committee: LIBE
Amendment 614 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 4.5 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3.
2021/06/10
Committee: LIBE
Amendment 615 #
Proposal for a regulation
Article 25 – paragraph 2
2. The Commission shall adopt delegated acts in accordance with Article 69 to adjust the number of average monthly recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases at least with 5 % in relation to its population in 2020 or, after adjustment by means of a delegated act, of its population in the year in which the latest delegated act was adopted. In that case, it shall adjust the number so that it corresponds to 10% of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions.
2021/06/10
Committee: LIBE
Amendment 618 #
Proposal for a regulation
Article 26
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks: (a) the dissemination of illegal content through their services; (b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; (c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. 2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Article 26 (Risk assessment) deleted
2021/06/10
Committee: LIBE
Amendment 642 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms may put in place reasonable, proportionate and effective measures, in order to address the dissemination of illegal content through their services. Such measures may include, where applicable:
2021/06/10
Committee: LIBE
Amendment 649 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide;
2021/06/10
Committee: LIBE
Amendment 650 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk.
2021/06/10
Committee: LIBE
Amendment 653 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;deleted
2021/06/10
Committee: LIBE
Amendment 656 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
(e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Article 35 and 37 respectively.deleted
2021/06/10
Committee: LIBE
Amendment 663 #
Proposal for a regulation
Article 27 – paragraph 2
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following: (a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33; (b) best practices for very large online platforms to mitigate the systemic risks identified.deleted
2021/06/10
Committee: LIBE
Amendment 674 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Board, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
2021/06/10
Committee: LIBE
Amendment 677 #
Proposal for a regulation
Article 28
1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: (a) the obligations set out in Chapter III; (b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37. 2. Audits performed pursuant to paragraph 1 shall be performed by organisations which: (a) are independent from the very large online platform concerned; (b) have proven expertise in the area of risk management, technical competence and capabilities; (c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards. 3. The organisations that perform the audits shall establish an audit report for each audit. The report shall be in writing and include at least the following: (a) the name, address and the point of contact of the very large online platform subject to the audit and the period covered; (b) the name and address of the organisation performing the audit; (c) a description of the specific elements audited, and the methodology applied; (d) a description of the main findings drawn from the audit; (e) an audit opinion on whether the very large online platform subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative; (f) where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance. 4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. 
Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
Article 28 (Independent audit) deleted
2021/06/10
Committee: LIBE
Amendment 708 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, meaningful information as to the logic involved, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available. Online platforms shall ensure consumers are not profiled by default, unless consumers genuinely opt-in, in line with the requirements established under Regulation (EU) 2016/679. Online platforms shall not subvert or impair consumers’ autonomy, decision-making, or choice via the structure, function or manner of operation of their online interface or any part thereof.
2021/06/10
Committee: LIBE
Amendment 722 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2 a. Very large online platforms that use recommender systems shall, by default, allow the recipient of the service to have information presented to them in chronological order only or, alternatively, where technically possible, to use third-party recommender systems.
2021/06/10
Committee: LIBE
Amendment 724 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2 b. By way of derogation from Article 16, this Article shall apply to all online platforms, regardless of their size.
2021/06/10
Committee: LIBE
Amendment 725 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/06/10
Committee: LIBE
Amendment 726 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a) the content of the advertisement;
2021/06/10
Committee: LIBE
Amendment 731 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) the selected contexts in which the ad was displayed;
2021/06/10
Committee: LIBE
Amendment 733 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and without delay, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
2021/06/10
Committee: LIBE
Amendment 741 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraphs 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
2021/06/10
Committee: LIBE
Amendment 744 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate.
2021/06/10
Committee: LIBE
Amendment 746 #
Proposal for a regulation
Article 2 a (new)
Article 2a
1. Providers of information society services shall only deliver and display advertising that is based on contextual information such as keywords, language context, or the approximate geographical region of the recipient of the service to whom an advertisement is delivered or displayed.
2. The use of the contextual information referred to in paragraph 1 shall only be permissible if the advertisement is delivered in real time, related data are not stored, and it does not involve the direct or, by means of combining it with other information, indirect identification of a natural person or group of persons, in particular by reference to an identifier such as a name, an identification number, precise location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person or group of persons.
3. Providers of information society services that deliver and display advertising on their online interfaces or on third-party services shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time:
(a) that the information displayed is an advertisement;
(b) the natural or legal person on whose behalf the advertisement is displayed;
(c) detailed information about the main parameters used to determine the recipient to whom the advertisement is delivered and displayed.
2021/07/08
Committee: IMCO
Amendment 748 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/06/10
Committee: LIBE
Amendment 752 #
Proposal for a regulation
Article 31 – paragraph 6 – introductory part
6. Within 3 days following receipt of a request as referred to in paragraph 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested because of one of the following two reasons:
2021/06/10
Committee: LIBE
Amendment 754 #
Proposal for a regulation
Article 31 – paragraph 6 – point b
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets.deleted
2021/06/10
Committee: LIBE
Amendment 756 #
Proposal for a regulation
Article 31 – paragraph 7
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.deleted
2021/06/10
Committee: LIBE
Amendment 761 #
Proposal for a regulation
Article 32 – paragraph 3 – point a
(a) cooperating with the Digital Services Coordinator of establishment and the Board for the purpose of this Regulation;
2021/06/10
Committee: LIBE
Amendment 762 #
Proposal for a regulation
Article 32 – paragraph 3 – point b
(b) organising and supervising the very large online platform’s activities relating to the independent audit pursuant to Article 28;deleted
2021/06/10
Committee: LIBE
Amendment 764 #
Proposal for a regulation
Article 33 – paragraph 2
2. In addition to the reports provided for in Article 13, very large online platforms shall make publicly available and transmit to the Digital Services Coordinator of establishment and the Commission, at least once a year and within 30 days following the adoption of the audit implementing report provided for in Article 28(4):
(a) a report setting out the results of the risk assessment pursuant to Article 26;
(b) the related risk mitigation measures identified and implemented pursuant to Article 27;
(c) the audit report provided for in Article 28(3);
(d) the audit implementation report provided for in Article 28(4).deleted
2021/06/10
Committee: LIBE
Amendment 768 #
Proposal for a regulation
Article 33 – paragraph 3
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports.deleted
2021/06/10
Committee: LIBE
Amendment 771 #
Proposal for a regulation
Article 34 – paragraph 1 – point d
(d) auditing of very large online platforms pursuant to Article 28;deleted
2021/06/10
Committee: LIBE
Amendment 773 #
Proposal for a regulation
Article 35
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
Article 35 (Codes of conduct) deleted
2021/06/10
Committee: LIBE
Amendment 781 #
Proposal for a regulation
Article 36
Codes of conduct for online advertising
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least:
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24;
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
Article 36 deleted
2021/06/10
Committee: LIBE
Amendment 787 #
Proposal for a regulation
Article 37
[...]deleted
2021/06/10
Committee: LIBE
Amendment 789 #
Proposal for a regulation
Article 38 – paragraph 2 – introductory part
2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union.
2021/06/10
Committee: LIBE
Amendment 791 #
Proposal for a regulation
Article 38 – paragraph 2 – subparagraph 1
For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator.
2021/06/10
Committee: LIBE
Amendment 797 #
Proposal for a regulation
Article 40 – paragraph 4
4. Paragraphs 1, 2 and 3 are without prejudice to the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Board under Section 3.
2021/06/10
Committee: LIBE
Amendment 800 #
Proposal for a regulation
Article 41 – paragraph 2 – point a
(a) the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding;deleted
2021/06/10
Committee: LIBE
Amendment 803 #
Proposal for a regulation
Article 41 – paragraph 3 – point b
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, or that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving an imminent threat to the life or safety of a person or persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
2021/06/10
Committee: LIBE
Amendment 807 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Service Coordinator receiving the complaint shall transmit it to that authority, and inform the person who lodged the complaint thereof.
2021/06/10
Committee: LIBE
Amendment 815 #
Proposal for a regulation
Article 45 – paragraph 5
5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Board, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4.
2021/06/10
Committee: LIBE
Amendment 816 #
Proposal for a regulation
Article 45 – paragraph 6
6. The Commission shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board.deleted
2021/06/10
Committee: LIBE
Amendment 817 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Service Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request.deleted
2021/06/10
Committee: LIBE
Amendment 820 #
Proposal for a regulation
Article 46 – title
Joint investigations and requests for Board intervention
2021/06/10
Committee: LIBE
Amendment 821 #
Proposal for a regulation
Article 46 – paragraph 2
2. Where a Digital Services Coordinator of establishment has reasons to suspect that a very large online platform infringed this Regulation, it may request the Board to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting the Board to intervene.
2021/06/10
Committee: LIBE
Amendment 822 #
Proposal for a regulation
Article 47 – paragraph 1
1. In order to ensure the consistent application of this Regulation, the ‘European Board for Digital Services’ (the ‘Board’) is hereby established as a body of the Union. The Board shall have legal personality.
2021/06/10
Committee: LIBE
Amendment 823 #
Proposal for a regulation
Article 47 – paragraph 1 a (new)
1 a. The Board shall act independently when performing its tasks or exercising its powers.
2021/06/10
Committee: LIBE
Amendment 824 #
Proposal for a regulation
Article 47 – paragraph 1 b (new)
1 b. The Board shall be represented by its Chair.
2021/06/10
Committee: LIBE
Amendment 825 #
Proposal for a regulation
Article 47 – paragraph 2 – introductory part
2. The Board shall take decisions in accordance with this Regulation to achieve the following objectives:
2021/06/10
Committee: LIBE
Amendment 826 #
Proposal for a regulation
Article 47 – paragraph 2 – point a
(a) ensuring the consistent application across the Union of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation;
2021/06/10
Committee: LIBE
Amendment 827 #
Proposal for a regulation
Article 47 – paragraph 2 – point b
(b) providing guidance and analysis to the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation;
2021/06/10
Committee: LIBE
Amendment 828 #
Proposal for a regulation
Article 47 – paragraph 2 – point c
(c) assisting the Digital Services Coordinators and the Commission in the supervision of very large online platforms.
2021/06/10
Committee: LIBE
Amendment 829 #
Proposal for a regulation
Article 48 – paragraph 2 a (new)
2 a. The Board shall elect a chair and two deputy chairs from amongst its members by simple majority.
2021/06/10
Committee: LIBE
Amendment 830 #
Proposal for a regulation
Article 48 – paragraph 2 b (new)
2 b. The term of office of the Chair and of the deputy chairs shall be five years and shall be renewable once.
2021/06/10
Committee: LIBE
Amendment 831 #
Proposal for a regulation
Article 48 – paragraph 3
3. The Board shall be chaired by the Chair. The Chair shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure.
2021/06/10
Committee: LIBE
Amendment 832 #
Proposal for a regulation
Article 48 – paragraph 5
5. The Board may invite experts and observers to attend its meetings, and shall cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.
2021/06/10
Committee: LIBE
Amendment 833 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure by a two-thirds majority of its members and shall organise its own operational arrangements.
2021/06/10
Committee: LIBE
Amendment 835 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) decide to take the measures referred to in Article 51, and adopt measures concerning very large online platforms in accordance with this Regulation;
2021/06/10
Committee: LIBE
Amendment 838 #
Proposal for a regulation
Article 50 – paragraph 1 – introductory part
1. The Board, acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of the provisions of Section 4 of Chapter III, make use of the enhanced supervision system laid down in this Article.
2021/06/10
Committee: LIBE
Amendment 839 #
Proposal for a regulation
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
2021/06/10
Committee: LIBE
Amendment 840 #
Proposal for a regulation
Article 50 – paragraph 3 – introductory part
3. Within one month following receipt of the action plan, the Board shall communicate its opinion on the action plan to the Digital Services Coordinator of establishment. Within one month following receipt of that opinion, that Digital Services Coordinator shall decide whether the action plan is appropriate to terminate or remedy the infringement.
2021/06/10
Committee: LIBE
Amendment 841 #
Proposal for a regulation
Article 50 – paragraph 3 – subparagraph 1
Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within one month from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2).
2021/06/10
Committee: LIBE
Amendment 842 #
Proposal for a regulation
Article 50 – paragraph 4 – introductory part
4. The Board shall communicate to the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable:
2021/06/10
Committee: LIBE
Amendment 843 #
Proposal for a regulation
Article 50 – paragraph 4 – point a
(a) within one month from the receipt of the audit report referred to in the second subparagraph of paragraph 3, where such an audit was performed;deleted
2021/06/10
Committee: LIBE
Amendment 844 #
Proposal for a regulation
Article 50 – paragraph 4 – point b
(b) within one month from the decision on the action plan referred to in the first subparagraph of paragraph 3, where no such audit was performed;
2021/06/10
Committee: LIBE
Amendment 845 #
Proposal for a regulation
Article 50 – paragraph 4 – subparagraph 1
Pursuant to that communication, the Digital Services Coordinator of establishment shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Board.
2021/06/10
Committee: LIBE
Amendment 846 #
Proposal for a regulation
Article 51 – title
Intervention by the Board and opening of proceedings
2021/06/10
Committee: LIBE
Amendment 848 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Board, acting either upon request of at least three Digital Services Coordinators of destination or on its own initiative, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/06/10
Committee: LIBE
Amendment 849 #
Proposal for a regulation
Article 51 – paragraph 1 – point a
(a) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment did not take any investigatory or enforcement measures, pursuant to the request of the Commission referred to in Article 45(7), upon the expiry of the time period set in that request;
2021/06/10
Committee: LIBE
Amendment 850 #
Proposal for a regulation
Article 51 – paragraph 1 – point b
(b) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment requested the Board to intervene in accordance with Article 46(2), upon the reception of that request;
2021/06/10
Committee: LIBE
Amendment 851 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. Where the Board decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/06/10
Committee: LIBE
Amendment 852 #
Proposal for a regulation
Article 51 – paragraph 3 – introductory part
3. The Digital Services Coordinator referred to in Articles 45(7), 46(2) and 50(1), as applicable, shall, without undue delay upon being informed, transmit to the Commission:
2021/06/10
Committee: LIBE
Amendment 853 #
Proposal for a regulation
Article 51 – paragraph 3 – point c
(c) any other information in the possession of that Digital Services Coordinator that may be relevant to the proceedings initiated by the Board.
2021/06/10
Committee: LIBE
Amendment 854 #
Proposal for a regulation
Article 51 – paragraph 4
4. The Board, and the Digital Services Coordinators making the request referred to in Article 45(1), shall, without undue delay upon being informed, transmit to the Commission any information in their possession that may be relevant to the proceedings initiated by the Commission.deleted
2021/06/10
Committee: LIBE
Amendment 857 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Board may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
2021/06/10
Committee: LIBE
Amendment 858 #
Proposal for a regulation
Article 52 – paragraph 2
2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Board shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect or misleading information.
2021/06/10
Committee: LIBE
Amendment 859 #
Proposal for a regulation
Article 52 – paragraph 3
3. Where the Board requires the very large online platform concerned or other person referred to in Article 52(1) to supply information by decision, it shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which it is to be provided. It shall also indicate the penalties provided for in Article 59 and indicate or impose the periodic penalty payments provided for in Article 60. It shall further indicate the right to have the decision reviewed by the Court of Justice of the European Union.
2021/06/10
Committee: LIBE
Amendment 860 #
Proposal for a regulation
Article 52 – paragraph 5
5. At the request of the Board, the Digital Services Coordinators and other competent authorities shall provide the Board with all necessary information to carry out the tasks assigned to it under this Section.
2021/06/10
Committee: LIBE
Amendment 861 #
Proposal for a regulation
Article 53 – paragraph 1
In order to carry out the tasks assigned to it under this Section, the Board may interview any natural or legal person which consents to being interviewed for the purpose of collecting information, relating to the subject-matter of an investigation, in relation to the suspected infringement or infringement, as applicable.
2021/06/10
Committee: LIBE
Amendment 862 #
Proposal for a regulation
Article 54 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Board may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1).
2021/06/10
Committee: LIBE
Amendment 863 #
Proposal for a regulation
Article 54 – paragraph 2
2. On-site inspections may also be carried out with the assistance of auditors or experts appointed by the Board pursuant to Article 57(2).
2021/06/10
Committee: LIBE
Amendment 864 #
Proposal for a regulation
Article 54 – paragraph 3
3. During on-site inspections the Board and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Board and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1).
2021/06/10
Committee: LIBE
Amendment 865 #
Proposal for a regulation
Article 54 – paragraph 4
4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Board. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union.
2021/06/10
Committee: LIBE
Amendment 866 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Board may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
2021/06/10
Committee: LIBE
Amendment 867 #
Proposal for a regulation
Article 56
1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action. 2. The Commission may, upon request or on its own initiative, reopen the proceedings: (a) where there has been a material change in any of the facts on which the decision was based; (b) where the very large online platform concerned acts contrary to its commitments; or (c) where the decision was based on incomplete, incorrect or misleading information provided by the very large online platform concerned or other person referred to in Article 52(1). 3. Where the Commission considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall reject those commitments in a reasoned decision when concluding the proceedings.Article 56 deleted Commitments
2021/06/10
Committee: LIBE
Amendment 869 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the CommissionBoard may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The CommissionBoard may also order that platform to provide access to, and explanations relating to, its databases and algorithms.
2021/06/10
Committee: LIBE
Amendment 870 #
Proposal for a regulation
Article 57 – paragraph 2
2. The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors to assist the CommissionBoard in monitoring compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the CommissionBoard.
2021/06/10
Committee: LIBE
Amendment 871 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
1. The CommissionBoard shall adopt a non- compliance decision where it finds that the very large online platform concerned does not comply with one or more of the following:
2021/06/10
Committee: LIBE
Amendment 872 #
Proposal for a regulation
Article 58 – paragraph 1 – point c
(c) commitments made binding pursuant to Article 56,deleted
2021/06/10
Committee: LIBE
Amendment 873 #
Proposal for a regulation
Article 58 – paragraph 2
2. Before adopting the decision pursuant to paragraph 1, the CommissionBoard shall communicate its preliminary findings to the very large online platform concerned. In the preliminary findings, the CommissionBoard shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings.
2021/06/10
Committee: LIBE
Amendment 874 #
Proposal for a regulation
Article 58 – paragraph 3
3. In the decision adopted pursuant to paragraph 1 the CommissionBoard shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable time period and to provide information on the measures that that platform intends to take to comply with the decision.
2021/06/10
Committee: LIBE
Amendment 875 #
Proposal for a regulation
Article 58 – paragraph 4
4. The very large online platform concerned shall provide the CommissionBoard with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation.
2021/06/10
Committee: LIBE
Amendment 876 #
Proposal for a regulation
Article 58 – paragraph 5
5. Where the CommissionBoard finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision.
2021/06/10
Committee: LIBE
Amendment 877 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the CommissionBoard may impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently:
2021/06/10
Committee: LIBE
Amendment 878 #
Proposal for a regulation
Article 59 – paragraph 1 – point c
(c) fails to comply with a voluntary measure made binding by a decision pursuant to Articles 56.deleted
2021/06/10
Committee: LIBE
Amendment 879 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The CommissionBoard may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 14% of the total turnover in the preceding financial year, where they intentionally or negligently:
2021/06/10
Committee: LIBE
Amendment 880 #
Proposal for a regulation
Article 59 – paragraph 2 – point b
(b) fail to rectify within the time period set by the CommissionBoard, incorrect, incomplete or misleading information given by a member of staff, or fail or refuse to provide complete information;
2021/06/10
Committee: LIBE
Amendment 881 #
Proposal for a regulation
Article 59 – paragraph 3
3. Before adopting the decision pursuant to paragraph 2, the CommissionBoard shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1).
2021/06/10
Committee: LIBE
Amendment 882 #
Proposal for a regulation
Article 59 – paragraph 4
4. In fixing the amount of the fine, the CommissionBoard shall have regard to the nature, gravity, duration and recurrence of the infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
2021/06/10
Committee: LIBE
Amendment 883 #
Proposal for a regulation
Article 60 – paragraph 1 – introductory part
1. The CommissionBoard may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to:
2021/06/10
Committee: LIBE
Amendment 884 #
Proposal for a regulation
Article 60 – paragraph 1 – point d
(d) comply with commitments made legally binding by a decision pursuant to Article 56(1);deleted
2021/06/10
Committee: LIBE
Amendment 885 #
Proposal for a regulation
Article 61 – paragraph 1
1. The powers conferred on the CommissionBoard by Articles 59 and 60 shall be subject to a limitation period of five years.
2021/06/10
Committee: LIBE
Amendment 886 #
Proposal for a regulation
Article 61 – paragraph 2
2. Time shall begin to run on the day on which the infringement is committeddecisions are taken. However, in the case of continuing or repeated infringements, time shall begin to run on the day on which the infringement ceases.
2021/06/10
Committee: LIBE
Amendment 887 #
Proposal for a regulation
Article 61 – paragraph 3 – introductory part
3. Any action taken by the CommissionBoard or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following:
2021/06/10
Committee: LIBE
Amendment 888 #
Proposal for a regulation
Article 61 – paragraph 3 – point a
(a) requests for information by the CommissionBoard or by a Digital Services Coordinator;
2021/06/10
Committee: LIBE
Amendment 889 #
Proposal for a regulation
Article 61 – paragraph 3 – point c
(c) the opening of a proceeding by the CommissionBoard pursuant to Article 51(2).
2021/06/10
Committee: LIBE
Amendment 890 #
Proposal for a regulation
Article 61 – paragraph 4
4. Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the CommissionBoard having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period is suspended pursuant to paragraph 5.
2021/06/10
Committee: LIBE
Amendment 891 #
Proposal for a regulation
Article 61 – paragraph 5
5. The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the CommissionBoard is the subject of proceedings pending before the Court of Justice of the European Union.
2021/06/10
Committee: LIBE
Amendment 892 #
Proposal for a regulation
Article 62 – paragraph 1
1. The power of the CommissionBoard to enforce decisions taken pursuant to Articles 59 and 60 shall be subject to a limitation period of five years.
2021/06/10
Committee: LIBE
Amendment 893 #
Proposal for a regulation
Article 62 – paragraph 3 – point b
(b) by any action of the CommissionBoard, or of a Member State acting at the request of the CommissionBoard, designed to enforce payment of the fine or periodic penalty payment.
2021/06/10
Committee: LIBE
Amendment 894 #
Proposal for a regulation
Article 63 – paragraph 1 – introductory part
1. Before adopting a decision pursuant to Articles 58(1), 59 or 60, the CommissionBoard shall give the very large online platform concerned or other person referred to in Article 52(1) the opportunity of being heard on:
2021/06/10
Committee: LIBE
Amendment 895 #
Proposal for a regulation
Article 63 – paragraph 1 – point a
(a) preliminary findings of the CommissionBoard, including any matter to which the Commission has taken objections; and
2021/06/10
Committee: LIBE
Amendment 896 #
Proposal for a regulation
Article 63 – paragraph 1 – point b
(b) measures that the CommissionBoard may intend to take in view of the preliminary findings referred to point (a).
2021/06/10
Committee: LIBE
Amendment 897 #
Proposal for a regulation
Article 63 – paragraph 2
2. The very large online platform concerned or other person referred to in Article 52(1) may submit their observations on the Commission’sBoard's preliminary findings within a reasonable time period set by the CommissionBoard in its preliminary findings, which may not be less than 14 days.
2021/06/10
Committee: LIBE
Amendment 898 #
Proposal for a regulation
Article 63 – paragraph 3
3. The CommissionBoard shall base its decisions only on objections on which the parties concerned have been able to comment.
2021/06/10
Committee: LIBE
Amendment 899 #
Proposal for a regulation
Article 63 – paragraph 4
4. The rights of defence of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the CommissionBoard's file under the terms of a negotiated disclosure, subject to the legitimate interest of the very large online platform concerned or other person referred to in Article 52(1) in the protection of their business secrets. The right of access to the file shall not extend to confidential information and internal documents of the CommissionBoard or Member States’ authorities. In particular, the right of access shall not extend to correspondence between the CommissionBoard and those authorities. Nothing in this paragraph shall prevent the CommissionBoard from disclosing and using information necessary to prove an infringement.
2021/06/10
Committee: LIBE
Amendment 900 #
Proposal for a regulation
Article 63 – paragraph 6
6. Without prejudice to the exchange and to the use of information referred to in Articles 51(3) and 52(5), the Commission, the Board, Member States’ authorities and their respective officials, servants and other persons working under their supervision,; and any other natural or legal person involved, including auditors and experts appointed pursuant to Article 57(2) shall not disclose information acquired or exchanged by them pursuant to this Section and of the kind covered by the obligation of professional secrecy.
2021/06/10
Committee: LIBE
Amendment 901 #
Proposal for a regulation
Article 64 – paragraph 1
1. The CommissionBoard shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed.
2021/06/10
Committee: LIBE
Amendment 902 #
Proposal for a regulation
Article 65 – paragraph 1 – introductory part
1. Where all powers pursuant to this Article to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Commission mayBoard shall request the Digital Services Coordinator of establishment of the very large online platform concerned to act pursuant to Article 41(3).
2021/06/10
Committee: LIBE
Amendment 903 #
Proposal for a regulation
Article 65 – paragraph 1 – subparagraph 1
Prior to making such request to the Digital Services Coordinator, the CommissionBoard shall invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures it intends to request and identifying the intended addressee or addressees thereof.
2021/06/10
Committee: LIBE
Amendment 904 #
Proposal for a regulation
Article 65 – paragraph 2 – introductory part
2. Where the coherent application of this Regulation so requires, the CommissionBoard, acting on its own initiative, may submit written observations to the competent judicial authority referred to Article 41(3). With the permission of the judicial authority in question, it may also make oral observations.
2021/06/10
Committee: LIBE
Amendment 905 #
Proposal for a regulation
Article 65 – paragraph 2 – subparagraph 1
For the purpose of the preparation of its observations only, the CommissionBoard may request that judicial authority to transmit or ensure the transmission to it of any documents necessary for the assessment of the case.
2021/06/10
Committee: LIBE
Amendment 906 #
Proposal for a regulation
Article 66
1. In relation to the Commission intervention covered by this Section, the Commission may adopt implementing acts concerning the practical arrangements for: (a) the proceedings pursuant to Articles 54 and 57; (b) the hearings provided for in Article 63; (c) the negotiated disclosure of information provided for in Article 63. 2. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. Before the adoption of any measures pursuant to paragraph 1, the Commission shall publish a draft thereof and invite all interested parties to submit their comments within the time period set out therein, which shall not be less than one month.Article 66 deleted Implementing acts relating to Commission intervention
2021/06/10
Committee: LIBE