
41 Amendments of Ivars IJABS related to 2020/0361(COD)

Amendment 185 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/20
Committee: JURI
Amendment 205 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the right and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non- discrimination of parties affected by illegal content.
2021/07/20
Committee: JURI
Amendment 225 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concepts of “illegal content” and “illegal goods” should underpin the general idea that what is illegal offline should also be illegal online. The concepts should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, the concepts should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that is not in compliance with Union law since it relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the sale of products or the provision of services in infringement of consumer protection law, and the non-authorised use of copyright protected material. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 301 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
2021/06/24
Committee: ITRE
Amendment 314 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. The information provided shall be broken down per Member State in which services are offered and in the Union as a whole.
2021/06/24
Committee: ITRE
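Taken together with the notice figures in Amendment 301 above, the breakdown required here implies a nested report shape: the same Article 13(1)(b) statistics repeated once per Member State, plus a Union-wide total. A minimal sketch in Python of one way such a report could be structured; every name below is an illustrative assumption, not something the amendments prescribe:

from dataclasses import dataclass, field

@dataclass
class NoticeStatistics:
    # Article 13(1)(b) figures; all field names are illustrative only.
    notices_by_alleged_content_type: dict[str, int] = field(default_factory=dict)
    notices_from_trusted_flaggers: int = 0
    actions_based_on_law: int = 0
    actions_based_on_terms_and_conditions: int = 0
    average_action_time_hours: float = 0.0

@dataclass
class TransparencyReport:
    # Proposed Article 13(1a): a breakdown per Member State plus a Union-wide total.
    per_member_state: dict[str, NoticeStatistics]  # keyed by Member State code, e.g. "LV"
    union_total: NoticeStatistics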
Amendment 316 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/06/24
Committee: ITRE
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 412 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4 a. Member States may recognise trusted flaggers already recognised in another Member State as trusted flaggers on their own territory. Trusted flaggers may be awarded the status of European trusted flagger.
2021/06/24
Committee: ITRE
Amendment 415 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6.
2021/06/24
Committee: ITRE
Amendment 439 #
Proposal for a regulation
Article 22 – title
22 Traceability of traders and online advertisers
2021/06/24
Committee: ITRE
Amendment 444 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders or sells online advertisements, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
2021/06/24
Committee: ITRE
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 482 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of Key Performance Indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
2021/06/24
Committee: ITRE
Amendment 498 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request information on where their advertisements have been placed;
(b) can request information on which broker treated their data;
(c) can indicate specific websites on which their advertisements cannot be placed.
In case of non-compliance with this provision, advertisers should have access to judicial redress.
2021/06/24
Committee: ITRE
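Points (a) to (c) of this new paragraph read like a small advertiser-facing interface: two information requests and one placement restriction. A minimal sketch of such an interface in Python; the class and method names are hypothetical, chosen only to mirror the three points:

from abc import ABC, abstractmethod

class AdvertiserPortal(ABC):
    # Hypothetical interface mirroring points (a) to (c) of the proposed paragraph.

    @abstractmethod
    def placements(self, campaign_id: str) -> list[str]:
        """(a) Return where the advertiser's advertisements have been placed."""

    @abstractmethod
    def data_brokers(self, campaign_id: str) -> list[str]:
        """(b) Return which brokers treated the advertiser's data."""

    @abstractmethod
    def exclude_websites(self, campaign_id: str, websites: list[str]) -> None:
        """(c) Register specific websites on which the ads cannot be placed."""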
Amendment 525 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/06/24
Committee: ITRE
Amendment 540 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use, deep fakes or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/06/24
Committee: ITRE
Amendment 563 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
2021/06/24
Committee: ITRE
Amendment 617 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(b a) the natural or legal person who paid for the advertisement;
2021/06/24
Committee: ITRE
Amendment 623 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2 a. The Board shall, together with trusted flaggers and vetted researchers, publish guidelines on the way ad libraries should be organized.
2021/06/24
Committee: ITRE
Amendment 623 #
Proposal for a regulation
Article 1 – paragraph 5 – introductory part
5. This Regulation and its exception of liability of digital operators is without any prejudice to, and does not hinder future regulation of, the rules laid down by the following:
2021/07/08
Committee: IMCO
Amendment 625 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2 b. Very large online platforms shall label inauthentic videos (‘deep fakes’) as inauthentic in a way that is clearly visible to the internet user.
2021/06/24
Committee: ITRE
Amendment 642 #
Proposal for a regulation
Article 33 – paragraph 2 b (new)
2 b. The reports shall be published in the official languages of the Member States of the Union.
2021/06/24
Committee: ITRE
Amendment 766 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
deleted
2021/07/08
Committee: IMCO
Amendment 776 #
Proposal for a regulation
Article 5 a (new)
Article 5a
Liability of online platforms allowing consumers to conclude distance contracts with traders
1. In addition to Article 5(1), an online platform allowing consumers to conclude distance contracts with traders shall not benefit from the liability exemption provided for in Article 5 if it does not comply with the obligations referred to in Articles 11, 13b, 13c, 14, 22 or 24a. Such liability exemption shall also not benefit the online platform if it does not comply with specific information requirements for contracts concluded on online marketplaces, in line with Article 6a(1) of Directive 2011/83/EU of the European Parliament and of the Council.
2. The liability exemption in Article 5(1) and in paragraph 1 of this Article shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its control, authority or decisive influence.
3. For the assessment of whether the online platform has that control, authority or decisive influence over the trader, relevant criteria shall include:
(a) the trader-consumer contract is concluded exclusively through facilities provided on the platform;
(b) the online platform operator withholds the identity of the trader or contact details until after the conclusion of the trader-consumer contract;
(c) the online platform operator exclusively uses payment systems which enable the platform operator to withhold payments made by the consumer to the trader;
(d) the terms of the trader-consumer contract are essentially determined by the online platform operator;
(e) the price to be paid by the consumer is set by the online platform operator;
(f) the online platform is marketing the product or service in its own name rather than using the name of the trader who will supply it.
4. The liability exemption in Article 5(1) of this Regulation shall not apply where an online platform allows consumers to conclude distance contracts with traders from third countries when:
(a) there is no economic operator inside the Union liable for the product safety, or the economic operator is available but does not respond to claims; and
(b) the product does not comply with the relevant Union or national law on product safety and product compliance.
5. Consumers concluding distance contracts with traders shall be entitled to seek redress from the online platform for infringement of the obligations laid down in this Regulation and in accordance with relevant Union and national law, i.e. liability for the damages that the consumer would be entitled to under EU rules on product liability (Council Directive 85/374/EEC) if the product is defective and sold in the EU.
6. The online platform shall be entitled to seek redress from the trader who has used its services in the case of a failure by that trader to comply with his obligations under this Regulation regarding the online platform or regarding the consumers.
2021/07/08
Committee: IMCO
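Paragraph 3 of the proposed Article 5a is, in effect, a checklist for assessing control, authority or decisive influence over a trader. A minimal sketch in Python of how the six criteria could be evaluated against the facts of a listing; all names are illustrative assumptions, and since the amendment lists criteria that are "relevant" rather than a strict threshold, the function reports which criteria are met instead of returning a verdict:

from dataclasses import dataclass

@dataclass
class ListingFacts:
    # Hypothetical facts about a trader-consumer listing; names are illustrative.
    contract_concluded_only_on_platform: bool        # criterion (a)
    trader_identity_withheld_until_contract: bool    # criterion (b)
    platform_can_withhold_consumer_payments: bool    # criterion (c)
    contract_terms_set_by_platform: bool             # criterion (d)
    price_set_by_platform: bool                      # criterion (e)
    marketed_under_platform_name: bool               # criterion (f)

def control_criteria_met(facts: ListingFacts) -> list[str]:
    # Return the paragraph 3 criteria that apply to this listing.
    criteria = {
        "(a) contract concluded exclusively through the platform": facts.contract_concluded_only_on_platform,
        "(b) trader identity withheld until after the contract": facts.trader_identity_withheld_until_contract,
        "(c) platform can withhold payments made by the consumer": facts.platform_can_withhold_consumer_payments,
        "(d) contract terms essentially determined by the platform": facts.contract_terms_set_by_platform,
        "(e) price set by the platform": facts.price_set_by_platform,
        "(f) marketed in the platform's own name": facts.marketed_under_platform_name,
    }
    return [label for label, met in criteria.items() if met]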
Amendment 840 #
Proposal for a regulation
Article 24 – paragraph 1 e (new)
Online platforms shall not be allowed to resort to cross-device and cross-service combination of data processed inside or outside the platform.
2021/07/19
Committee: JURI
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/19
Committee: JURI
Amendment 898 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
2021/07/19
Committee: JURI
Amendment 908 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
2021/07/19
Committee: JURI
Amendment 1112 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/19
Committee: JURI
Amendment 1124 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/19
Committee: JURI
Amendment 1128 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/19
Committee: JURI
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report in Article 28(3).
2021/07/08
Committee: IMCO
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/08
Committee: IMCO
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO