
57 Amendments of Jessica POLFJÄRD related to 2020/0361(COD)

Amendment 140 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation and the competitiveness of European companies should not be hampered but instead be stimulated.
2021/09/10
Committee: ECON
Amendment 161 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers in connection with information relating to illegal content, products, services and activities. The illegal nature of such content, products or services is defined by relevant Union law or national law in accordance with Union law. The concept should be understood, for example, to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/09/10
Committee: ECON
Amendment 174 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation, without prejudice to Article 6, in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/09/10
Committee: ECON
Amendment 186 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services, cloud services or search engines. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, where they qualify as ‘mere conduit’, ‘caching’ or hosting service.
2021/09/10
Committee: ECON
Amendment 193 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
2021/09/10
Committee: ECON
Amendment 216 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. In this regard, the Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises. Furthermore, the Commission and Digital Services Coordinators are also encouraged to do so for medium enterprises, which, while not benefitting from the exemptions in Section 3, may sometimes lack the legal resources necessary to ensure proper understanding of, and compliance with, all provisions. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/09/10
Committee: ECON
Amendment 220 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. Such entities can also include businesses that have a vested interest in flagging counterfeit products of their brand, thus ensuring the online consumer experience is safer and more reliable. Similarly, for intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA (OJ L 135, 24.5.2016, p. 53).
2021/09/10
Committee: ECON
Amendment 225 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content manifestly related to a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the relevant competent authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/09/10
Committee: ECON
Amendment 230 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders on the platforms should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
2021/09/10
Committee: ECON
Amendment 232 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of some of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. The online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties or be liable for this information in case it proves to be inaccurate. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
_________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/09/10
Committee: ECON
Amendment 253 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. Such reinforcement could include the expansion of, and allocation of resources to, content moderation in languages other than English. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/09/10
Committee: ECON
Amendment 258 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform systematically does not comply with this Regulation or the commitments undertaken. A disclaimer of an opinion should be given where the auditor does not have enough information to conclude on an opinion due to the novelty of the issues audited.
2021/09/10
Committee: ECON
Amendment 301 #
Proposal for a regulation
Recital 100
(100) Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for systemic non-compliance with the relevant obligations and breach of the procedural rules, subject to appropriate limitation periods. A systematic infringement is a pattern of online harm that, when the individual harms are added up, constitutes an aggregation of systemic harm to active recipients of the service across three or more EU Member States.
2021/09/10
Committee: ECON
Amendment 327 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, through the sale of products or provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/09/10
Committee: ECON
Amendment 328 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature or functionality of another service or the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
2021/09/10
Committee: ECON
Amendment 354 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services;
2021/09/10
Committee: ECON
Amendment 356 #
Proposal for a regulation
Article 8 – paragraph 2 – subparagraph 1 (new)
In extraordinary cases, where the intermediary service has reasonable doubt that the removal order is not legally sound, the intermediary service should have access to a mechanism to challenge the decision. This mechanism shall be established by the Digital Services Coordinators in coordination with the Board and the Commission.
2021/09/10
Committee: ECON
Amendment 359 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2 a (new)
- the order is transmitted via secure channels established between the relevant national judicial or administrative authorities and the providers of intermediary services.
2021/09/10
Committee: ECON
Amendment 361 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public, to trusted flaggers as well as users in all Member States, the information necessary to easily identify and communicate with their intermediary services' single points of contact.
2021/09/10
Committee: ECON
Amendment 378 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in through the provider’s voluntary own-initiative investigations as per Article 6, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
2021/09/10
Committee: ECON
Amendment 394 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) where possible, a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/09/10
Committee: ECON
Amendment 395 #
Proposal for a regulation
Article 14 – paragraph 2 – point c
(c) where possible, the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
2021/09/10
Committee: ECON
Amendment 399 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 solely in respect of the specific item of information concerned, when the provider of hosting services can unequivocally identify the illegal nature of the content.
2021/09/10
Committee: ECON
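For implementers, the notice elements of Article 14(2) and the "actual knowledge" condition of Amendment 399 can be modelled as a simple record with a completeness check. This is an illustrative sketch only: the Regulation prescribes no schema, and all field names below are assumptions, not terms from the text.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical field names; Article 14(2) lists the substance of a notice,
# not a data format. The name/email may be withheld where the notice concerns
# offences referred to in Articles 3 to 7 of Directive 2011/93/EU.
@dataclass
class Notice:
    explanation: str               # reasons why the content is considered illegal
    urls: list[str]                # exact electronic location(s), where possible
    submitter_name: Optional[str]
    submitter_email: Optional[str]
    good_faith_statement: bool     # statement of accuracy and good faith

def is_complete(notice: Notice) -> bool:
    """A notice containing the Article 14(2) elements may give rise to actual
    knowledge or awareness under Article 5 -- per Amendment 399, only where the
    provider can also unequivocally identify the illegal nature of the content,
    which this structural check cannot decide on its own."""
    return bool(notice.explanation) and bool(notice.urls) and notice.good_faith_statement

notice = Notice(
    explanation="Counterfeit listing of a registered trademark",
    urls=["https://example.com/item/123"],
    submitter_name="Rights Holder Org",
    submitter_email="legal@example.org",
    good_faith_statement=True,
)
print(is_complete(notice))  # True
```

The check is deliberately conservative: a structurally complete notice is a precondition for, not proof of, actual knowledge.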
Amendment 409 #
Proposal for a regulation
Article 15 – paragraph 2 – subparagraph 1 (new)
Where a provider of hosting services decides not to remove or disable access to specific items of information provided by the recipients of the service, detected through the mechanisms established in Article 14, it shall inform the user who notified the online platform of the content and, where needed, the recipient of the decision without undue delay. The notification of such a decision can be done through automated means.
2021/09/10
Committee: ECON
Amendment 411 #
Proposal for a regulation
Article 15 – paragraph 4
4. deleted
2021/09/10
Committee: ECON
Amendment 415 #
Proposal for a regulation
Article 16 – paragraph 1 – subparagraph 1 (new)
The Commission and Digital Services Coordinators may work together on information and guidelines for the voluntary implementation of the provisions in this Regulation for micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/09/10
Committee: ECON
Amendment 416 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide to all recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge. Complaints can be filed against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/09/10
Committee: ECON
Amendment 420 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable or restrict access to the information;
2021/09/10
Committee: ECON
Amendment 421 #
Proposal for a regulation
Article 17 – paragraph 1 – subparagraph 1 (new)
Complaints can also be lodged against decisions made by the online platform to not remove, not disable, not suspend and not terminate access to accounts.
2021/09/10
Committee: ECON
Amendment 426 #
Proposal for a regulation
Article 17 – paragraph 3 – point a (new)
(a) Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is indeed illegal and is incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does warrant the suspension or termination of the service or the account, it shall also reverse its decision referred to in paragraph 1 without undue delay.
2021/09/10
Committee: ECON
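The Article 17 amendments above extend complaint-handling in two directions: decisions to restrict access become appealable (Amendment 420), and so do "negative" decisions not to act (Amendment 421), within the six-month window of paragraph 1. A minimal admissibility sketch, with decision-type names that are purely illustrative (the Regulation names the decisions but not any identifiers), might look as follows:

```python
from datetime import date, timedelta

# Decision types appealable under Article 17(1) as amended; the string labels
# are illustrative, not drawn from the Regulation.
APPEALABLE = {
    "remove", "disable", "restrict",            # Amendment 420
    "suspend_service", "terminate_service",
    "suspend_account", "terminate_account",
    "not_remove", "not_disable",                # Amendment 421
    "not_suspend", "not_terminate",
}

def complaint_admissible(decision_type: str, decision_date: date,
                         lodged_on: date) -> bool:
    """Complaints must be accepted for at least six months after the decision;
    six months is approximated here as 183 days, an assumption the text does
    not make explicit."""
    within_window = lodged_on - decision_date <= timedelta(days=183)
    return decision_type in APPEALABLE and within_window

print(complaint_admissible("not_remove", date(2021, 1, 15), date(2021, 5, 1)))  # True
print(complaint_admissible("remove", date(2021, 1, 15), date(2021, 12, 1)))     # False
```

Note that Article 17 sets a floor ("at least six months"), so a platform may accept later complaints; the sketch encodes only the minimum.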
Amendment 438 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entity, by the Commission or by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions:
2021/09/10
Committee: ECON
Amendment 439 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, except in the case of businesses with a vested interest in flagging counterfeit products of their brand, thus ensuring the online consumer experience is safer and more reliable;
2021/09/10
Committee: ECON
Amendment 443 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators and the Commission shall communicate to each other and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2.
2021/09/10
Committee: ECON
Amendment 445 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the authority that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/09/10
Committee: ECON
Amendment 447 #
Proposal for a regulation
Article 19 – paragraph 6
6. The authority that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, that authority shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/09/10
Committee: ECON
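Article 19(5) obliges a platform to report a trusted flagger to the awarding authority when a "significant number" of its notices prove insufficiently precise or substantiated, which under paragraph 6 can trigger revocation. The Regulation does not quantify "significant"; the threshold and minimum sample below are purely illustrative operating parameters a platform would have to choose and defend itself:

```python
def should_report_flagger(total_notices: int, rejected_notices: int,
                          threshold: float = 0.25, minimum: int = 20) -> bool:
    """Decide whether to notify the authority that awarded trusted flagger
    status (Article 19(5)). The 25% rejection share and 20-notice floor are
    assumptions for illustration only, not values from the Regulation."""
    if total_notices < minimum:
        return False  # too small a sample to call the number "significant"
    return rejected_notices / total_notices >= threshold

print(should_report_flagger(100, 30))  # True  (30% of notices rejected)
print(should_report_flagger(100, 10))  # False (10% rejected)
print(should_report_flagger(5, 5))     # False (below the minimum sample)
```

Any such report must be accompanied by explanations and supporting documents, and the final revocation decision rests with the awarding authority, not the platform.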
Amendment 459 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it has its main establishment or its legal representative and also transmit the information to Europol for appropriate follow-up.
2021/09/10
Committee: ECON
Amendment 462 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders on the platform, it shall ensure that traders can only use its services to promote messages on or to offer products, services or content to consumers located in the Union if, prior to the use of its services, the trader has provided the following information to the online platform:
2021/09/10
Committee: ECON
Amendment 464 #
Proposal for a regulation
Article 22 – paragraph 1 – point b
(b) a passport or a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council50 ; _________________ 50 Regulation (EU) No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC
2021/09/10
Committee: ECON
Amendment 466 #
Proposal for a regulation
Article 22 – paragraph 1 – point d
(d) to the extent the contract relates to products that are subject to the Union Regulations listed in Article 4(5) of Regulation (EU) 2019/1020 of the European Parliament and the Council, the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and established in the Union, referred to in Article 4(1) of Regulation (EU) 2019/1020 of the European Parliament and the Council51 or any relevant act of Union law; _________________ 51Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
2021/09/10
Committee: ECON
Amendment 470 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. Provided that the online platform has made reasonable efforts to assess the information in points (a), (d) and (e), the online platform shall not be held liable for information provided by the trader that proves to be inaccurate.
2021/09/10
Committee: ECON
Amendment 471 #
Proposal for a regulation
Article 22 – paragraph 3 – introductory part
3. Where the online platform obtains indications, through its reasonable efforts under paragraph 2 or through Member States' consumer authorities, that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/09/10
Committee: ECON
Amendment 515 #
Proposal for a regulation
Article 27 – paragraph 2 – subparagraph 1 (new)
(c) measures taken by the Digital Service Coordinators, the Board and the Commission to ensure that highly sensitive information and business secrets are kept confidential.
2021/09/10
Committee: ECON
Amendment 521 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
2021/09/10
Committee: ECON
Amendment 523 #
Proposal for a regulation
Article 28 – paragraph 2 – point c
(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.;
2021/09/10
Committee: ECON
Amendment 524 #
Proposal for a regulation
Article 28 – paragraph 2 – subparagraph 1 (new)
(d) have not provided an audit to the same very large online platform for more than three consecutive years.
2021/09/10
Committee: ECON
Amendment 525 #
Proposal for a regulation
Article 28 – paragraph 3 – point f
(f) where the audit opinion is not posiegative, operational recommendations on specific measures to achieve compliance. and risk- based remediation timelines with a focus on rectifying issues that have the potential to cause most harm to users of the service as a priority;
2021/09/10
Committee: ECON
Amendment 526 #
Proposal for a regulation
Article 28 – paragraph 3 – subparagraph 1 (new)
(g) where the organisations that perform the audits do not have sufficient information to form an opinion due to the novelty of the issues audited, a disclaimer shall be given.
2021/09/10
Committee: ECON
Amendment 543 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. The delegated acts should also lay out the technical conditions needed to ensure confidentiality and security of information by the vetted researchers once they acquire access to the data, including guidelines for academics who wish to publish findings based on the confidential data acquired. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/09/10
Committee: ECON
Amendment 560 #
Proposal for a regulation
Article 39 – paragraph 1 – subparagraph 1 (new)
Member States shall designate the status of Digital Services Coordinator based on the following criteria:
(a) the authority has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
(b) it represents collective interests and is independent from any online platform;
(c) it has the capacity to carry out its activities in a timely, diligent and objective manner.
2021/09/10
Committee: ECON
Amendment 574 #
Proposal for a regulation
Article 44 – paragraph 1
1. Digital Services Coordinators shall draw up an annual reports on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board.
2021/09/10
Committee: ECON
Amendment 575 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
(ba) measures taken by the Digital Service Coordinators to ensure that highly sensitive information and business secrets are kept confidential;
2021/09/10
Committee: ECON
Amendment 576 #
Proposal for a regulation
Article 44 – paragraph 2 – point b b (new)
(bb) an assessment of the interpretation of the Country of Origin principle in the supervisory and enforcement activities of the Digital Services Coordinators, especially with regard to Article 45 of this Regulation.
2021/09/10
Committee: ECON
Amendment 607 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms, without prejudice to Directive (EU) 2016/943 on trade secrets.
2021/09/10
Committee: ECON
Amendment 648 #
Proposal for a regulation
Article 69 – paragraph 2
2. The delegation of power referred to in Articles 23, 25 and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation].
2021/09/10
Committee: ECON
Amendment 649 #
Proposal for a regulation
Article 69 – paragraph 3
3. The delegation of power referred to in Articles 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
2021/09/10
Committee: ECON
Amendment 650 #
Proposal for a regulation
Article 69 – paragraph 5
5. A delegated act adopted pursuant to Articles 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
2021/09/10
Committee: ECON
Amendment 655 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - threnine months after its entry into force].
2021/09/10
Committee: ECON