
27 Amendments of Petras AUŠTREVIČIUS related to 2020/0361(COD)

Amendment 301 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
2021/06/24
Committee: ITRE
Amendment 314 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. The information provided shall be broken down per Member State in which services are offered and in the Union as a whole.
2021/06/24
Committee: ITRE
Amendment 316 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/06/24
Committee: ITRE
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 412 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4 a. Member States may acknowledge trusted flaggers recognized in another Member State as trusted flaggers on their own territory. Trusted flaggers may be awarded the status of European trusted flagger.
2021/06/24
Committee: ITRE
Amendment 415 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6.
2021/06/24
Committee: ITRE
Amendment 439 #
Proposal for a regulation
Article 22 – title
Traceability of traders and online advertisers
2021/06/24
Committee: ITRE
Amendment 444 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders or sells online advertisements, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
2021/06/24
Committee: ITRE
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 482 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of Key Performance Indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
2021/06/24
Committee: ITRE
Amendment 498 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers:
(a) can request information on where their advertisements have been placed;
(b) can request information on which broker processed their data;
(c) can indicate specific websites on which their advertisements may not be placed.
In case of non-compliance with this provision, advertisers should have the option of seeking judicial redress.
2021/06/24
Committee: ITRE
Amendment 525 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/06/24
Committee: ITRE
Amendment 540 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use, deep fakes or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/06/24
Committee: ITRE
Amendment 563 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
2021/06/24
Committee: ITRE
Amendment 617 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(b a) the natural or legal person who paid for the advertisement;
2021/06/24
Committee: ITRE
Amendment 623 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2 a. The Board shall, together with trusted flaggers and vetted researchers, publish guidelines on the way ad libraries should be organized.
2021/06/24
Committee: ITRE
Amendment 625 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2 b. Very large online platforms shall label inauthentic videos (‘deep fakes’) as inauthentic in a way that is clearly visible to the internet user.
2021/06/24
Committee: ITRE
Amendment 641 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
2 a. The reports shall include information on content moderation, broken down per Member State in which the services are offered and in the Union as a whole.
2021/06/24
Committee: ITRE
Amendment 642 #
Proposal for a regulation
Article 33 – paragraph 2 b (new)
2 b. The reports shall be published in the official languages of the Member States of the Union.
2021/06/24
Committee: ITRE
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigating measures listed in Article 27(1), it shall provide a written explanation describing the reasons why those measures were not put in place, which shall be made available to the independent auditors for the purpose of preparing the audit report referred to in Article 28(3).
2021/07/08
Committee: IMCO
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/08
Committee: IMCO
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO