
24 Amendments of Christine SCHNEIDER related to 2020/0361(COD)

Amendment 28 #
Proposal for a regulation
Recital 2
(2) Until now, policy has relied on voluntary cooperation with a view to addressing these risks and challenges. Since this has proved insufficient and there has been a lack of harmonised rules at Union level, Member States have been increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.
2021/07/15
Committee: FEMM
Amendment 30 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the freedom to conduct a business, the gender equality principle and non-discrimination. In order to exercise these rights, the online world needs to be a safe space, especially for women and girls, where everybody can move freely. Therefore, measures to protect from, and prevent, phenomena such as online violence, cyberstalking, harassment, hate speech and exploitation of women and girls are essential.
2021/07/15
Committee: FEMM
Amendment 34 #
Proposal for a regulation
Recital 5
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. Given that online platforms are part of our everyday life and have become indispensable, even more so since the pandemic, the spread of illegal and harmful content, such as child sexual abuse material, online sexual harassment, unlawful non-consensual sharing of private images and videos, and cyber violence, has risen dramatically as well. Ensuring a safe space online implies targeted actions against all phenomena harmfully affecting our social life, including through an awaited proposal on how to deal with harmful but not illegal content online. _________________ 26 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
2021/07/15
Committee: FEMM
Amendment 35 #
Proposal for a regulation
Recital 9
(9) This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 Regulation (EU) 2021/784 of the European Parliament and of the Council29 and the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. _________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).
2021/07/15
Committee: FEMM
Amendment 37 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly in order to underpin the general idea that what is illegal offline should also be illegal online. The concept should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, child sexual abuse material, terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings, sexual exploitation of women and girls, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images and videos, online stalking, grooming of adolescents, online sexual harassment and other forms of gender-based violence, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/15
Committee: FEMM
Amendment 47 #
Proposal for a regulation
Recital 26 a (new)
(26 a) While intermediary services have already applied a risk assessment, there is still potential to improve the security and safety of all users, especially children, women and other vulnerable groups. Therefore, providers of intermediary services, more precisely online platforms and very large online platforms, shall regularly evaluate their risk assessment and, if found necessary, improve it. Given the importance of providers of intermediary services and their potential to impact social life, common rules determining how users shall behave online should be applied. The implementation of a code of conduct should be obligatory for every provider of intermediary services covered by this Regulation.
2021/07/15
Committee: FEMM
Amendment 48 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online, and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. Member States should ensure that the competent authorities fulfil their tasks in an objective, independent and non-discriminatory manner. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) 2021/784 addressing the dissemination of terrorist content online, the recently adopted Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online, or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law, and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
2021/07/15
Committee: FEMM
Amendment 52 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and girls, as well as vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/15
Committee: FEMM
Amendment 53 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. Providers offering their services in more than one Member State should provide a breakdown of the information by Member State. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 In line with the annual reports broken down by content moderation action and by Member State, the results concerning all forms of violence against women and girls online, hate speech and other illegal content should be reflected in crime statistics. All forms of violence against women and girls shall be reported as a separate category in those crime statistics and law enforcement entities shall list them separately. _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/15
Committee: FEMM
Amendment 57 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the gender equality principle, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/15
Committee: FEMM
Amendment 63 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising reproducing stereotypical content with an impact on the equal treatment and opportunities of citizens, contrary to the gender equality principle. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/15
Committee: FEMM
Amendment 66 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, unlawful non-consensual sharing of private images and videos, online stalking or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the gender equality principle with the right to non-discrimination and the rights of the child. The social dimension, as online platforms play a major role in our everyday life, is also affected by phenomena such as online harassment and cyber violence. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform, including when algorithms are misinformed, causing gender gaps to widen, or the misuse of their service through the submission of abusive notices or other methods for silencing speech, causing harm, such as long-term mental health damage, psychological damage and societal damage, or hampering competition.
A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/15
Committee: FEMM
Amendment 77 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should be obliged to regularly review their algorithms to minimise such negative consequences and should ensure that recipients are appropriately informed and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients have a visible, user-friendly and readily available option to turn off algorithmic selection by the recommender system entirely, as well as options that are not based on profiling of the recipient. Gender-based algorithmic bias must be prevented in order to avoid a discriminatory impact on women and girls.
2021/07/15
Committee: FEMM
Amendment 84 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as gender equality, including equality between women and men, and non-discrimination, eradicating all forms of violence against women and girls, including online violence, harassment and sexual exploitation, online stalking, child abuse, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/15
Committee: FEMM
Amendment 89 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including equality, are effectively protected.
2021/07/15
Committee: FEMM
Amendment 90 #
Proposal for a regulation
Article 1 – paragraph 5 – point d
(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;
2021/07/15
Committee: FEMM
Amendment 91 #
Proposal for a regulation
Article 1 – paragraph 5 – point d a (new)
(d a) Regulation of the European Parliament and of the Council on a temporary derogation from certain provisions of Directive 2002/58/EC of the European Parliament and of the Council as regards the use of technologies by number-independent interpersonal communications service providers for the processing of personal and other data for the purpose of combatting child sexual abuse online;
2021/07/15
Committee: FEMM
Amendment 98 #
Proposal for a regulation
Article 12 – title
Terms and conditions, code of conduct
2021/07/15
Committee: FEMM
Amendment 101 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, non-discriminatory, transparent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
2021/07/15
Committee: FEMM
Amendment 103 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Providers of intermediary services shall be obliged to include on their platforms a code of conduct setting out behavioural rules for their users. These rules shall be publicly accessible in an easy-to-read format and shall be set out in clear and unambiguous language.
2021/07/15
Committee: FEMM
Amendment 107 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include breakdowns at Member State level and, in particular, information on the following, as applicable:
2021/07/15
Committee: FEMM
Amendment 135 #
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, gender equality, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/15
Committee: FEMM
Amendment 138 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content, of content that risks an increase in online violence, and of information that is incompatible with their terms and conditions.
2021/07/15
Committee: FEMM
Amendment 148 #
(b) targeted measures aimed at limiting the display of advertisements or harmful content in association with the service they provide;
2021/07/15
Committee: FEMM