
42 Amendments of Samira RAFAELA related to 2020/0361(COD)

Amendment 29 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to gender equality and non-discrimination. Children, especially girls, have specific rights enshrined in Article 24 of the Charter and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/07/15
Committee: FEMM
Amendment 49 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health, including mental health, the safety and trust of the recipients of the service, including minors, women, LGBTIQ+ people and vulnerable users such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities. The World Health Organisation defines ‘health’ as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity. This definition reflects the fact that the development of new technologies might bring new health risks to users, in particular for children and women, such as psychological risks, developmental risks, mental health risks, depression, loss of sleep, or altered brain function.
2021/07/15
Committee: FEMM
Amendment 59 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/15
Committee: FEMM
Amendment 69 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/15
Committee: FEMM
Amendment 86 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including gender equality, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/15
Committee: FEMM
Amendment 92 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(d a) ‘child’ means any natural person under the age of 18;
2021/07/15
Committee: FEMM
Amendment 100 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1 a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate, promote gender equality and the rights of LGBTIQ+ people, and meet the highest European or international standards, pursuant to Article 34.
2021/07/15
Committee: FEMM
Amendment 105 #
Proposal for a regulation
Article 12 a (new)
Article 12 a
Child impact assessment
1. All providers shall assess whether their services are accessed by, likely to be accessed by, or impact children, especially girls. Providers of services likely to impact children, especially girls, shall identify, analyse and assess, during the design and development of new services, on an ongoing basis and at least once a year, any systemic risks stemming from the functioning and use of their services in the Union for children, especially girls. These risk impact assessments shall be specific to their services, meet the highest European or international standards detailed in Article 34, and shall consider all known content, contact, conduct and contract risks. Assessments shall also include the following systemic risks:
(a) the dissemination of illegal content or behaviour enabled, manifested on or as a result of their services;
(b) any negative effects for the exercise of the rights of the child, as enshrined in Article 24 of the Charter and in the UN Convention on the Rights of the Child, and detailed in the United Nations Committee on the Rights of the Child General comment No. 25 as regards the digital environment;
(c) any negative effects on the right to gender equality, as enshrined in Article 23 of the Charter, particularly the right to live free from violence as envisaged by the Council of Europe Convention on preventing and combating violence against women and domestic violence (Istanbul Convention);
(d) any negative effects on the right to non-discrimination, as enshrined in Article 21 of the Charter;
(e) any intended or unintended consequences resulting from the operation or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on children’s rights, especially of girls.
2. When conducting such impact assessments, providers of intermediary services likely to impact children, especially girls, shall take into account, in particular, how their terms and conditions, content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions or with the rights of the child, especially of girls.
2021/07/15
Committee: FEMM
Amendment 106 #
Proposal for a regulation
Article 12 b (new)
Article 12 b
Mitigation of risks to children, especially girls
Providers of intermediary services likely to impact children, especially girls, shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 12 a. Such measures shall include, where applicable:
(a) implementing mitigation measures identified in Article 27 with regard for children’s best interests;
(b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments;
(c) implementing proportionate and privacy-preserving age assurance, meeting the standard outlined in Article 34;
(d) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child and gender equality;
(e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18;
(f) preventing profiling of children, including for commercial purposes such as targeted advertising;
(g) ensuring published terms are age-appropriate and uphold children’s rights and gender equality;
(h) providing child-friendly and inclusive mechanisms for remedy and redress, including easy access to expert advice and support.
2021/07/15
Committee: FEMM
Amendment 110 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the gender and, in the case of children, the age of complainants, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/07/15
Committee: FEMM
Amendment 111 #
Proposal for a regulation
Article 13 – paragraph 1 a (new)
1 a. Providers of intermediary services that impact children, especially girls, shall publish, at least once a year:
(a) child impact assessments to identify known harms, unintended consequences and emerging risks; these impact assessments shall comply with the standards outlined in Article 34;
(b) clear, easily comprehensible and detailed reports outlining the gender equality and child risk mitigation measures undertaken, their efficacy and any outstanding actions required; these reports shall comply with the standards outlined in Article 34, including as regards age assurance and age verification, in line with a child-centred design that equally promotes gender equality.
2021/07/15
Committee: FEMM
Amendment 124 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, especially girls, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
2021/07/15
Committee: FEMM
Amendment 128 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/15
Committee: FEMM
Amendment 130 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/15
Committee: FEMM
Amendment 144 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/15
Committee: FEMM
Amendment 149 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1 a. Where a very large online platform decides not to put in place any of the mitigation measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
2021/07/15
Committee: FEMM
Amendment 151 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/15
Committee: FEMM
Amendment 158 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2 a. The Commission shall support and promote the development and implementation of industry standards set by relevant European and international standardisation bodies for the protection and promotion of the rights of the child and the right to gender equality, observance of which, once adopted, will be mandatory, at least for the following:
(a) age assurance and age verification pursuant to Article 13;
(b) child impact assessments pursuant to Article 13;
(c) age-appropriate terms and conditions that equally promote gender equality pursuant to Article 12;
(d) child-centred design that equally promotes gender equality pursuant to Article 13.
2021/07/15
Committee: FEMM
Amendment 167 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/15
Committee: FEMM
Amendment 168 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/15
Committee: FEMM
Amendment 169 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/15
Committee: FEMM
Amendment 185 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/20
Committee: JURI
Amendment 205 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/20
Committee: JURI
Amendment 342 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, health and trust of the recipients of the service, including minors, women and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to provide recourse to recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 478 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service through the submission of abusive notices, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 840 #
Proposal for a regulation
Article 24 – paragraph 1 e (new)
Online platforms shall not be allowed to resort to cross-device and cross-service combination of data processed inside or outside the platform.
2021/07/19
Committee: JURI
Amendment 856 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/19
Committee: JURI
Amendment 864 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 882 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/19
Committee: JURI
Amendment 898 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigation measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
2021/07/19
Committee: JURI
Amendment 908 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators and following public consultations, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.
2021/07/19
Committee: JURI
Amendment 1112 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/19
Committee: JURI
Amendment 1124 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/19
Committee: JURI
Amendment 1128 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/19
Committee: JURI
Amendment 1550 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1563 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of any of the fundamental rights listed in the Charter, in particular on the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1606 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1626 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. Where a very large online platform decides not to put in place any of the mitigation measures listed in Article 27(1), it shall provide a written explanation that describes the reasons why those measures were not put in place, which shall be provided to the independent auditors in order to prepare the audit report referred to in Article 28(3).
2021/07/08
Committee: IMCO
Amendment 1658 #
Proposal for a regulation
Article 28 – paragraph 1 – point a
(a) the obligations set out in Chapter III, in particular the quality of the identification, analysis and assessment of the risks referred to in Article 26, and the necessity, proportionality and effectiveness of the risk mitigation measures referred to in Article 27;
2021/07/08
Committee: IMCO
Amendment 2099 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 2
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, shall, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision without undue delay and in any event within two months.
2021/07/08
Committee: IMCO
Amendment 2120 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, shall initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/07/08
Committee: IMCO
Amendment 2130 #
Proposal for a regulation
Article 51 – paragraph 2 – subparagraph 1
When the Commission initiates proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned.
2021/07/08
Committee: IMCO