Activities of Josianne CUTAJAR related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (16)

Amendment 129 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/06/10
Committee: LIBE
Amendment 187 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter of Fundamental Rights of the European Union and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/07/08
Committee: IMCO
Amendment 345 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health – including mental health, the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 412 #
Proposal for a regulation
Article 12 a (new)
Article 12 a Child impact assessment 1. All providers must assess whether their services are accessed by, likely to be accessed by or impact on children, defined as persons under the age of 18. Providers of services likely to be accessed by or impact on children shall identify, analyse and assess, during the design and development of new services and at least once a year thereafter, any systemic risks stemming from the functioning and use made of their services in the Union by children. These risk impact assessments shall be specific to their services, meet the highest European or International standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments should also include the following systemic risks: a. the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services; b. any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment; c. any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection or rights of children; 2. When conducting child impact assessments, providers of intermediary services likely to impact children shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child.
2021/06/10
Committee: LIBE
Amendment 414 #
Proposal for a regulation
Article 12 b (new)
Article 12 b Mitigation of risks to children Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable: a. implementing mitigation measures identified in Article 27 with regard for children’s best interests; b. adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments; c. implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34; d. adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child; e. ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18; f. preventing profiling, including for commercial purposes like targeted advertising; g. ensuring published terms are age appropriate and uphold children’s rights; h. providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support;
2021/06/10
Committee: LIBE
Amendment 422 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the age of complainants (if minors), the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/06/10
Committee: LIBE
Amendment 427 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2 a. Providers of intermediary services that impact on children shall publish, at least once a year: a. child impact assessments to identify known harms, unintended consequences and emerging risks, pursuant to Article 13 (Art. 12 a new). The child impact assessments must comply with the standards outlined in Article 34; b. clear, easily comprehensible and detailed reports outlining the child risk mitigation measures undertaken pursuant to Article 14, their efficacy and any outstanding actions required. These reports must comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design. The content of these reports must be verifiable by independent audit; data sets and source code must be made available at the request of the regulator.
2021/06/10
Committee: LIBE
Amendment 666 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(da) ‘child’ means any natural person under the age of 18;
2021/07/08
Committee: IMCO
Amendment 772 #
Proposal for a regulation
Article 34 – paragraph 1 a (new)
1 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child, observance of which, once adopted, will be mandatory, at least for the following: a. age assurance and age verification pursuant to Articles 12 a (new) and 12 b (new) and 13; b. child impact assessments pursuant to Articles 12 a (new) and 13; c. age-appropriate terms and conditions pursuant to Article 12; d. child-centred design pursuant to Articles 12 b (new) and 13.
2021/06/10
Committee: LIBE
Amendment 937 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate and meet the highest European or International standards, pursuant to Article 34.
2021/07/08
Committee: IMCO
Amendment 968 #
Proposal for a regulation
Article 12 a (new)
Article 12a Child impact assessment 1. All providers must assess whether their services are accessed by, likely to be accessed by or impact on children. Providers of services likely to be accessed by or impact on children shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year thereafter, any systemic risks stemming from the functioning and use made of their services in the Union by children. These risk impact assessments shall be specific to their services, meet the highest European or International standards detailed in Article 34, and shall consider all known content, contact, conduct or commercial risks included in the contract. Assessments should also include the following systemic risks: (a) the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services; (b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment; (c) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection or rights of children; 2. When conducting child impact assessments, providers of intermediary services likely to impact children shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child.
2021/07/08
Committee: IMCO
Amendment 973 #
Proposal for a regulation
Article 12 b (new)
Article 12b Mitigation of risks to children Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable: (a) implementing mitigation measures identified in Article 27 with regard for children’s best interests; (b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments; (c) implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34; (d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child; (e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18; (f) preventing profiling, including for commercial purposes like targeted advertising; (g) ensuring published terms are age appropriate and uphold children’s rights; (h) providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support;
2021/07/08
Committee: IMCO
Amendment 992 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the age of complainants (if children), the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/08
Committee: IMCO
Amendment 997 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1a. Providers of intermediary services that impact on children shall publish, at least once a year: (a) child impact assessments to identify known harms, unintended consequences and emerging risks. The child impact assessments must comply with the standards outlined in Article 34; (b) clear, easily comprehensible and detailed reports outlining the child risk mitigation measures undertaken, their efficacy and any outstanding actions required. These reports must comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design.
2021/07/08
Committee: IMCO
Amendment 1178 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access, and user-friendly, including for children, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
2021/07/08
Committee: IMCO
Amendment 1509 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
2. The profiling of children for commercial purposes, including targeted or personalised advertising, is prohibited in compliance with the industry standards laid down in Article 34 and Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO