
Activities of Bart GROOTHUIS related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (51)

Amendment 122 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law, national law, or international law and the interests of international comity.
2021/06/23
Committee: ITRE
Amendment 123 #
Proposal for a regulation
Recital 33
(33) Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information under either Union or national law, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders.
2021/06/23
Committee: ITRE
Amendment 158 #
Proposal for a regulation
Recital 49
(49) In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online marketplaces should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the provider of the online marketplace, including for purposes of promoting messages on or offering products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
2021/06/23
Committee: ITRE
Amendment 160 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the providers of online marketplaces covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the providers of online marketplaces covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such providers, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Providers of online marketplaces should also design and organise their online interface in a user-friendly way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. The online interface shall allow traders to provide the information allowing for the unequivocal identification of the product or the service, including labelling requirements, in compliance with legislation on product safety and product compliance. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48 Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/06/23
Committee: ITRE
Amendment 301 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
2021/06/24
Committee: ITRE
Amendment 314 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. The information provided shall be broken down per Member State in which services are offered and in the Union as a whole.
2021/06/24
Committee: ITRE
Amendment 316 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraphs 1 and 1a shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/06/24
Committee: ITRE
Amendment 367 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises, within the meaning of the Annex to Recommendation 2003/361/EC. The Commission should assess the need to exclude micro and small enterprises that reach a large audience, based on the number of average monthly active recipients of the service in the Union, calculated in accordance with the methodology set out in the delegated acts referred to in Article 25(3).
2021/06/24
Committee: ITRE
Amendment 412 #
Proposal for a regulation
Article 19 – paragraph 4 a (new)
4 a. Member States can acknowledge trusted flaggers recognised in another Member State as trusted flaggers on their own territory. Trusted flaggers can be awarded the status of European trusted flagger.
2021/06/24
Committee: ITRE
Amendment 415 #
Proposal for a regulation
Article 19 – paragraph 7
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6.
2021/06/24
Committee: ITRE
Amendment 439 #
Proposal for a regulation
Article 22 – title
Traceability of traders and online advertisers
2021/06/24
Committee: ITRE
Amendment 444 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders or sells online advertisements, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
2021/06/24
Committee: ITRE
Amendment 445 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Providers of online marketplaces shall ensure that traders can only use their services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services, the online marketplaces have obtained the following information:
2021/06/24
Committee: ITRE
Amendment 459 #
Proposal for a regulation
Article 22 – paragraph 2
2. The provider of the online marketplace shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources.
2021/06/24
Committee: ITRE
Amendment 464 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
3. Where the provider of the online marketplace obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that marketplace shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/06/24
Committee: ITRE
Amendment 467 #
Proposal for a regulation
Article 22 – paragraph 4
4. The provider of the online marketplace shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
2021/06/24
Committee: ITRE
Amendment 470 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the provider of the online marketplace shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation.
2021/06/24
Committee: ITRE
Amendment 471 #
Proposal for a regulation
Article 22 – paragraph 6
6. The provider of the online marketplace shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
2021/06/24
Committee: ITRE
Amendment 473 #
Proposal for a regulation
Article 22 – paragraph 7
7. The provider of the online marketplace shall design and organise its online interface in a fair and user-friendly way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
2021/06/24
Committee: ITRE
Amendment 475 #
Proposal for a regulation
Article 22 – paragraph 7 a (new)
7 a. The online interface shall allow traders to provide the information allowing for the unequivocal identification of the product or the service, and, where applicable, the information concerning the labelling, including CE marking, which are mandatory under applicable legislation on product safety and product compliance.
2021/06/24
Committee: ITRE
Amendment 482 #
Proposal for a regulation
Article 23 – paragraph 4
4. The Commission shall adopt implementing acts to establish a set of Key Performance Indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
2021/06/24
Committee: ITRE
Amendment 498 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Online platforms that display advertising on their online interfaces shall ensure that advertisers: (a) can request information on where their advertisements have been placed; (b) can request information on which broker treated their data; (c) can indicate specific websites on which their advertisements cannot be placed. In case of non-compliance with this provision, advertisers should have the option of judicial redress.
2021/06/24
Committee: ITRE
Amendment 525 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. The risk assessment shall be broken down per Member State in which services are offered and in the Union as a whole. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/06/24
Committee: ITRE
Amendment 540 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use, deep fakes or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/06/24
Committee: ITRE
Amendment 563 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports of the Board shall be broken down per Member State in which the systemic risks occur and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union. The reports shall include the following:
2021/06/24
Committee: ITRE
Amendment 610 #
2 a. Online platforms shall ensure that their online interface is designed in such a way that it does not risk misleading or manipulating the recipients of the service.
2021/06/24
Committee: ITRE
Amendment 617 #
Proposal for a regulation
Article 30 – paragraph 2 – point b a (new)
(b a) the natural or legal person who paid for the advertisement;
2021/06/24
Committee: ITRE
Amendment 623 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2 a. The Board shall, together with trusted flaggers and vetted researchers, publish guidelines on the way ad libraries should be organised.
2021/06/24
Committee: ITRE
Amendment 624 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2 b. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, and upon request of advertisers, independent audits performed by organisations complying with the criteria set out in Article 28(2). Such audits shall be based on fair and proportionate conditions agreed between platforms and advertisers, shall be conducted with a reasonable frequency and shall entail: (a) conducting quantitative and qualitative assessment of cases where advertising is associated with illegal content or with content incompatible with platforms’ terms and conditions; (b) monitoring for and detecting fraudulent use of their services to fund illegal activities; (c) assessing the performance of their tools in terms of brand safety. The audit report shall include an opinion on the performance of platforms’ tools in terms of brand safety. Where the audit opinion is not positive, the report shall make operational recommendations to the platforms on specific measures in order to achieve compliance. The platforms shall make available to advertisers, upon request, the results of such audit.
2021/06/24
Committee: ITRE
Amendment 625 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2 b. Very large online platforms shall label inauthentic videos (‘deep fakes’) as inauthentic in a way that is clearly visible to the internet user.
2021/06/24
Committee: ITRE
Amendment 641 #
Proposal for a regulation
Article 33 – paragraph 2 a (new)
2 a. The reports shall include content moderation broken down per Member State in which the services are offered and in the Union as a whole.
2021/06/24
Committee: ITRE
Amendment 642 #
Proposal for a regulation
Article 33 – paragraph 2 b (new)
2 b. The reports shall be published in the official languages of the Member States of the Union.
2021/06/24
Committee: ITRE
Amendment 966 #
Proposal for a regulation
Article 12 a (new)
Article 12a General Risk Assessment and Mitigation Measures 1. Providers of intermediary services shall identify, analyse and assess, at least once and at each significant revision of a service thereafter, the potential misuse or other risks stemming from the functioning and use made of their services in the Union. Such a general risk assessment shall be specific to their services and shall include at least risks related to the dissemination of illegal content through their services and any content that might have a negative effect on potential recipients of the service, especially minors, and on gender equality. 2. Providers of intermediary services which identify potential risks shall, wherever possible, attempt to put in place reasonable, proportionate and effective mitigation measures in line with their terms and conditions. 3. Where the identified risk relates to minors, regardless of whether the child is acting in accordance with the terms and conditions, mitigation measures shall include, taking into account the industry standards referred to in Article 34, where needed and applicable: (a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure those prioritise the best interests of the child; (b) adapting or removing system design features that expose children to, or promote, content, contact, conduct and contract risks; (c) ensuring the highest levels of privacy, safety, and security by design and default for children, including in any profiling or use of data for commercial purposes; (d) if a service is targeted at children, providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support. 4. Providers of intermediary services shall, upon request, explain to the Digital Services Coordinator of the Member State of establishment how they undertook this risk assessment and what voluntary mitigation measures they undertook.
2021/07/08
Committee: IMCO
Amendment 983 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; Providers of intermediary services may add additional information as to the reasons for the average time for taking the action.
2021/07/08
Committee: IMCO
Amendment 1285 #
Proposal for a regulation
Article 19 – paragraph 2 – point c a (new)
(ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually
2021/07/08
Committee: IMCO
Amendment 1286 #
Proposal for a regulation
Article 19 – paragraph 2 – point c b (new)
(cb) it is not already a trusted flagger in another Member State.
2021/07/08
Committee: IMCO
Amendment 1287 #
Proposal for a regulation
Article 19 – paragraph 2 – point c c (new)
(cc) it publishes, at least once a year, clear, easily comprehensible and detailed reports on any notices submitted in accordance with Article 14 during the relevant period. The report shall list notices categorised by the identity of the hosting service provider, the type of content concerned that is allegedly illegal or in violation of terms and conditions, and what action was taken by the provider. In addition, the report shall identify relationships between the trusted flagger and any online platform, law enforcement, or other government or relevant commercial entity, and explain the means by which the trusted flagger maintains its independence.
2021/07/08
Committee: IMCO
Amendment 1576 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service and amplification of content that is in breach of their terms and conditions, including by means of inauthentic use, or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/08
Committee: IMCO
Amendment 1602 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective measures to mitigate the probability and severity of any systemic risk, tailored to address the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1664 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraph 1 shall be performed by organisations which have been selected by the Commission and:
2021/07/08
Committee: IMCO
Amendment 1711 #
Proposal for a regulation
Article 30 – title
Additional online advertising transparency and protection
2021/07/08
Committee: IMCO
Amendment 1725 #
Proposal for a regulation
Article 30 – paragraph 2 – point c a (new)
(ca) the natural or legal person or group who paid for the advertisement;
2021/07/08
Committee: IMCO
Amendment 1738 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. Very large online platforms shall be prohibited from profiling children under the age of 16 for commercial practices, including personalized advertising, in compliance with industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1740 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. The Board shall, after consulting with trusted flaggers and vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
2021/07/08
Committee: IMCO
Amendment 1743 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Where a very large online platform becomes aware that a piece of content is a deep fake, the provider shall label the content in a way that informs that the content is inauthentic and that is clearly visible for the recipient of the services.
2021/07/08
Committee: IMCO
Amendment 1749 #
Proposal for a regulation
Article 30 – paragraph 2 e (new)
2e. Very large online platforms that display advertising on their online interfaces shall ensure that advertisers: (a) can request and obtain information on where their advertisements have been placed; (b) can request and obtain information on which broker treated their data;
2021/07/08
Committee: IMCO
Amendment 1761 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall: (1) be affiliated with academic institutions within the Union, and the institution certifies that the researcher is a researcher in good standing; (2) be independent from commercial interests, including any very large online platforms; (3) be independent from any government, administrative or other state bodies, outside the academic institution of affiliation if public; (4) have undergone an independent background and security investigation, subject to the national legislation of the Member State of residence; (5) be a resident of the Union; (6) have proven records of expertise in the fields related to the risks investigated or related research methodologies; and (7) commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/08
Committee: IMCO
Amendment 1835 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory for very large online platforms, at least for the following: (a) age assurance and age verification; (b) child impact assessments; (c) child-centred and age-appropriate design; (d) child-centred and age-appropriate terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1874 #
Proposal for a regulation
Article 35 – paragraph 5
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic and repeated failure to comply with the codes of conduct, the Board shall, as a measure of last resort, take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the codes of conduct.
2021/07/08
Committee: IMCO
Amendment 1882 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service, and civil society organisations or relevant authorities to contribute to further transparency for all actors in the online advertising value chain, beyond the requirements of Articles 24 and 30.
2021/07/08
Committee: IMCO
Amendment 1893 #
Proposal for a regulation
Article 36 a (new)
Article 36a Codes of conduct for the protection of minors 1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers and organisations representing minors, parents and civil society organisations or relevant authorities to further contribute to the protection of minors online. 2. The Commission shall aim to ensure that the codes of conduct pursue an effective protection of minors online, which respects their rights as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment. The Commission shall aim to ensure that the codes of conduct address at least: (a) age verification and age assurance models, taking into account the industry standards referred to in Article 34; (b) child-centred and age-appropriate design, taking into account the industry standards referred to in Article 34. 3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date.
2021/07/08
Committee: IMCO