Activities of Silvia MODIG related to 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/10/13
Committee: FEMM
Dossiers: 2020/0361(COD)
Documents: PDF(302 KB) DOC(213 KB)
Authors: Jadwiga WIŚNIEWSKA (MEP ID 124877)

Amendments (28)

Amendment 36 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable, accessible (including for persons with disabilities) and trusted online environment, for the purpose of this Regulation the concept of "illegal content" should be defined broadly, within the principle that what is illegal offline should also be illegal online. The concept should cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as trafficking in human beings and online sexual exploitation of women and girls, forced marriages, the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, hacking, doxing, mobbing, sextortion, the grooming of adolescents, online sexual harassment, impersonation, cyberbullying, cyber sexual harassment, intersectional hate speech online against women and LGBTIQ+ persons, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/15
Committee: FEMM
Amendment 41 #
Proposal for a regulation
Recital 12 a (new)
(12 a) Online violence is a phenomenon which needs to be addressed efficiently for the safety of all users. It affects women and girls disproportionately, not only causing them psychological harm and physical suffering but also deterring them from digital participation in political, social, cultural and economic life. As in real life, women, and particularly women with intersecting identities and vulnerabilities, experience on the internet a continuum of aggressions ranging from unwanted sexual advances and sexist and/or racist (ageist, ableist, homophobic, transphobic, sizeist, etc.) insults to frequent, harmful, frightening and sometimes life-threatening abuse. Research by the World Health Organization estimates that one in ten women has already experienced a form of cyber violence since the age of 15. In addition, the growing trend of online violence and abuse against women has accelerated during the Covid-19 pandemic. Access to the internet is fast becoming a necessity for economic well-being and must be viewed as a fundamental human right. In this regard, it is crucial to ensure that the digital public space is a safe and empowering place for everyone, particularly for women and girls in all their diversity.
2021/07/15
Committee: FEMM
Amendment 44 #
Proposal for a regulation
Recital 12 b (new)
(12 b) The Covid-19 pandemic has had a significant impact on almost all spheres of life, including on organised crime. For example, traffickers have increasingly moved online for every phase of trafficking, which has a gender dimension, since women and girls make up the majority (75%) of victims of trafficking for all purposes. Perpetrators of human trafficking for sexual exploitation are using cyberspace to recruit, advertise and exercise control over women and children, who are intrinsically more vulnerable to this crime. Other forms of organised crime facilitated by digital tools include different types of exploitation, forced begging, forced and sham marriages, forced criminality, the removal of organs and the illegal adoption of children.
2021/07/15
Committee: FEMM
Amendment 61 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens, in particular with regard to gender equality as well as LGBTIQ+ equality and racial equality. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain the consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/15
Committee: FEMM
Amendment 65 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material, unlawful non-consensual sharing of private images, online stalking, revenge porn, the unsolicited receiving of sexually explicit materials, doxing, cyberbullying, rape threats or illegal hate speech, including online sexist and anti-LGBTIQ+ hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to equality between women and men, the rights of the child and the right to full control over personal data and information online, with a rejection of practices by private companies to use data for profit and to manipulate behaviour, as well as of surveillance practices by any actor to control and restrict women's speech and activism. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech, hampering competition or putting gender equality at risk. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform's service, with a foreseeable impact on health, civic discourse, electoral processes, public security and the protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform's terms and conditions.
2021/07/15
Committee: FEMM
Amendment 71 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, including with regard to gender equality, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They should include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or the supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers and women's rights and human rights organisations, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and gender equality and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform's economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/15
Committee: FEMM
Amendment 72 #
Proposal for a regulation
Recital 58 a (new)
(58 a) Transparency and the effectiveness of processes are key to making online platforms safer to use and to tackling online violence and illegal content. Online platforms' decisions on whether and how they act to remove illegal, abusive and harmful content vary hugely, and some reports can remain unanswered. Easily accessible information on how and why content is removed must be available to all users, and these processes need to be fully transparent. Very large online platforms should actively report and publish meaningful data on how they handle gender-based and other identity-based violence, and they should share this information in an easy and accessible way on their platforms on an annual basis. This should include the number of reports they receive per year, as well as the number of those reports that received no response, disaggregated by the category of illegal, harmful and abusive content being reported. Very large online platforms should ensure that experts and academics have access to the relevant data, for example to enable them to compare and evaluate how measures are working and to gain a better understanding of the extent of the problem. They should also link their measures to international human rights standards, and regularly evaluate and update the implementation of their own ethical standards.
2021/07/15
Committee: FEMM
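Read as a data-publication duty, the reporting obligation proposed in Amendment 72 implies a concrete record structure: annual notice counts and unanswered-notice counts, broken down by content category. The following is a minimal sketch of one possible shape for such a record; all field names, category labels and figures are hypothetical illustrations, not taken from the amendment or the Regulation.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CategoryNoticeStats:
    """Annual notice statistics for one content category (hypothetical schema)."""
    category: str            # illustrative label, not a category defined by the amendment
    notices_received: int    # notices received during the reporting year
    notices_unanswered: int  # notices that received no response from the platform

# Illustrative annual record for a fictional platform; the numbers are
# placeholders showing the format only.
annual_transparency_report = {
    "platform": "example-vlop.eu",
    "year": 2021,
    "per_category": [
        asdict(CategoryNoticeStats("gender-based harassment", 12_000, 800)),
        asdict(CategoryNoticeStats("illegal hate speech", 45_000, 2_100)),
    ],
}

print(json.dumps(annual_transparency_report, indent=2))
```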
Amendment 73 #
Proposal for a regulation
Recital 58 b (new)
(58 b) Sufficient resources are needed to remove illegal, abusive and harmful content from the services that very large online platforms offer to their users, and transparency is needed on how, and to what extent, very large online platforms train their workers on this issue. Neither users nor trusted flaggers are fully capable of providing the effective moderation needed to ensure that very large online platforms become safer. Very large online platforms should actively share information on the number of content moderators they employ, including the number of moderators employed per region and per language. Continuous training is also needed on issues such as gender-based, anti-Roma and anti-LGBTIQ+ discrimination and other identity-based violence and racism. Very large online platforms should actively share information on how moderators are trained to identify gender-based and other identity-based violence and abuse against users, as well as on how moderators are trained on international human rights standards and on their responsibility to respect the rights of users on their platform. They should also actively involve human rights organisations in their training programmes.
2021/07/15
Committee: FEMM
Amendment 74 #
Proposal for a regulation
Recital 58 c (new)
(58 c) The content of very large online platforms needs to be fully and easily accessible to all of their users. This can be achieved by building user-friendly measures into the services that very large online platforms offer. Very large online platforms should present their terms of service in a machine-readable format and also make all previous versions of their terms of service easily accessible to the public, including to persons with disabilities. Options to report potentially illegal, abusive and harmful content must be easy to find and to use in the native language of the user. Information on support for persons affected and on national contact points should be easily available. Very large online platforms should offer and develop easily accessible services for all users in these and similar cases. They should also make moderation as easy as possible, with the help of tools, training and similar support for people administering and moderating online groups that use their platforms and services. They should also improve and ensure the accessibility of the elements and functions of their services for persons with disabilities.
2021/07/15
Committee: FEMM
Amendment 76 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform's business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. These algorithms may lead to setbacks in gender equality, such as an increase in cases of gender and racial discrimination and cyber violence against women and LGBTIQ+ persons, or they can exacerbate toxic masculinity as well as amplify existing harmful gender stereotypes, including ableist, ageist and racial stereotyping. Consequently, very large online platforms should be obliged to regularly review their algorithms in an intersectional, gender-responsive manner in order to minimise any negative consequences for gender equality, and should ensure that recipients are appropriately informed and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible and accessible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. They should provide meaningful information about the algorithmic tools they use in content moderation and content curation. Very large online platforms should also offer easily accessible and comprehensive explanations that allow users to understand when, why, for which tasks and to what extent algorithmic tools are used. They should allow users, in an easy and accessible way, to accept or reject the algorithms used in the service provided. They should allow independent researchers and relevant regulators to audit their algorithmic tools to make sure they are used as intended.
2021/07/15
Committee: FEMM
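Amendment 76 asks platforms to disclose the main parameters of their recommender systems and to offer an alternative not based on profiling. A rough sketch of what a user-selectable, non-profiling option could look like in practice follows; every name, weight and field here is a hypothetical illustration, not drawn from the amendment or from any platform's actual system.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RecommenderSettings:
    """User-facing recommender options (hypothetical names and defaults)."""
    use_profiling: bool = False  # the non-profiling alternative is the default here
    main_parameters: Dict[str, float] = field(default_factory=lambda: {
        "recency": 0.7,            # disclosed weight for how new an item is
        "followed_accounts": 0.3,  # disclosed weight for accounts the user follows
    })

def rank(items: List[dict], settings: RecommenderSettings) -> List[dict]:
    """Order items reverse-chronologically unless the user opted into profiling."""
    if not settings.use_profiling:
        # Non-profiling option: plain reverse-chronological feed.
        return sorted(items, key=lambda it: it["posted_at"], reverse=True)
    # Profiling-based option: apply the disclosed parameter weights.
    w = settings.main_parameters
    def score(it: dict) -> float:
        return (w["recency"] * it["recency_score"]
                + w["followed_accounts"] * it["followed_score"])
    return sorted(items, key=score, reverse=True)
```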
Amendment 78 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform's online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. Disinformation, especially political disinformation, has become a major problem, and very large online platforms have increasingly become the channels through which such content is shared, especially via advertising. In the case of repeated violations, very large online platforms should, in consultation with independent experts, remove extremist actors. Very large online platforms should implement comprehensive and verifiable standards and measures to limit the reach of extremist actors and purposeful disinformation.
2021/07/15
Committee: FEMM
Amendment 82 #
Proposal for a regulation
Recital 88
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. This advisory group should strive to achieve a gender-balanced representation in its composition. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In the case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State.
2021/07/15
Committee: FEMM
Amendment 83 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairpersonship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, in particular gender equality, LGBTIQ+ equality and non-discrimination, gender-based and other identity-based online violence and harassment, online stalking, online sex trafficking, child abuse, data protection, electronic communications, audiovisual services, the detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/15
Committee: FEMM
Amendment 88 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible (including for persons with disabilities), predictable and trusted online environment, where fundamental rights enshrined in the Charter, including non-discrimination and gender equality, are effectively protected.
2021/07/15
Committee: FEMM
Amendment 102 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Very large online platforms as defined in Article 25(1) shall publish their terms and conditions in all official languages of the Union. They shall present their terms and conditions in a machine-readable format and make all previous versions of their terms and conditions easily accessible to the public, including to persons with disabilities. Options to report potentially illegal, abusive and harmful content shall be easy to find and use in the native language of the user. They shall also make moderation as easy as possible, with the help of tools, training and similar support for people administering and moderating online groups using their platforms and services.
2021/07/15
Committee: FEMM
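The machine-readable publication duty proposed in Amendment 102 could take many concrete forms. One minimal sketch is a versioned JSON catalogue of the terms and conditions; the field names, language codes and URLs below are entirely hypothetical and serve only to illustrate what "machine-readable, with all previous versions accessible" might mean in practice.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TermsVersion:
    """One published version of the terms and conditions (hypothetical schema)."""
    version: str
    published: str   # ISO 8601 publication date
    languages: list  # official Union languages the text is available in
    url: str         # accessible human-readable version of this text

# Current and all previous versions stay publicly retrievable.
catalogue = [
    TermsVersion("2.0", "2021-06-01", ["bg", "en", "fi", "sv"],
                 "https://example-vlop.eu/terms/2.0"),
    TermsVersion("1.0", "2020-01-15", ["en"],
                 "https://example-vlop.eu/terms/1.0"),
]

print(json.dumps([asdict(v) for v in catalogue], indent=2))
```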
Amendment 115 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal or harmful content. Those mechanisms shall be easy to access, user-friendly, gender-responsive and allow for the submission of notices exclusively by electronic means.
2021/07/15
Committee: FEMM
Amendment 132 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to equality between women and men and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/15
Committee: FEMM
Amendment 133 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on gender equality or on the protection of public health (including mental health), minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/15
Committee: FEMM
Amendment 136 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisements influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content, of content risking an increase in online violence, gender-based and other identity-based violence and discrimination and disinformation against women, LGBTIQ+ persons and persons with disabilities, or deepening the marginalisation of vulnerable communities who are often targets of hate speech online, and of information that is incompatible with their terms and conditions.
2021/07/15
Committee: FEMM
Amendment 139 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2 a. Very large online platforms shall regularly review their algorithms in an intersectional, gender-responsive manner in order to minimise negative consequences for gender equality, such as an increase in cases of online violence and abuse against women, girls and LGBTIQ+ persons, and consequently physical violence, or the promotion of content spreading harmful gender stereotypes, including ableist, ageist and racial stereotyping. Very large online platforms shall implement comprehensive and verifiable standards and measures to limit the reach of extremist actors and purposeful disinformation.
2021/07/15
Committee: FEMM
Amendment 140 #
Proposal for a regulation
Article 26 – paragraph 2 b (new)
2 b. Very large online platforms shall offer easily accessible explanations that allow users to understand when, why, for which tasks and to what extent algorithmic tools are used. They shall allow users, in an easy and accessible way, to accept or reject the algorithms used in the platforms and services provided. They shall allow independent researchers and relevant regulators to audit their algorithmic tools to make sure they are used as intended.
2021/07/15
Committee: FEMM
Amendment 141 #
Proposal for a regulation
Article 26 – paragraph 2 c (new)
2 c. Very large online platforms shall share, on a yearly basis, information on the number of content moderators they employ, including the number of moderators employed per region and per language. They shall also share information on how moderators are trained to identify gender-based and other identity-based violence, disinformation and abuse against users, as well as on how moderators are trained on international human rights standards and their responsibility to respect the rights of users on their platforms and services. They shall also actively involve human rights organisations in their training programmes.
2021/07/15
Committee: FEMM
Amendment 142 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate, effective and gender-responsive mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures shall include, where applicable:
2021/07/15
Committee: FEMM
Amendment 145 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions, including with regard to gender equality;
2021/07/15
Committee: FEMM
Amendment 146 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of advertisements or harmful content or the discriminatory display of advertising with an impact on gender equality in association with the service they provide;
2021/07/15
Committee: FEMM
Amendment 161 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, women's rights organisations, LGBTIQ+ support groups and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/15
Committee: FEMM
Amendment 163 #
Proposal for a regulation
Article 37 – paragraph 4 – point e
(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information, the right to equality between women and men, the right to non-discrimination and the rights of the child;
2021/07/15
Committee: FEMM
Amendment 164 #
Proposal for a regulation
Article 48 – paragraph 5
5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as with external experts as appropriate, in areas such as equality, in particular gender equality, LGBTIQ+ equality and non-discrimination, gender-based and other identity-based online violence and harassment, online sex trafficking, online stalking, child abuse, disinformation, data protection, electronic communications, audiovisual services, the detection and investigation of frauds against the Union budget as regards customs duties, or consumer protection, if necessary for the performance of its tasks. The Board shall make the results of this cooperation publicly available and easily accessible, including to persons with disabilities.
2021/07/15
Committee: FEMM