
Activities of Kira Marie PETER-HANSEN related to 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/10/13
Committee: FEMM
Dossiers: 2020/0361(COD)
Documents: PDF (302 KB), DOC (213 KB)
Authors: Jadwiga WIŚNIEWSKA (MEP ID 124877)

Amendments (17)

Amendment 50 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors, women and vulnerable users, such as those with protected characteristics under Article 21 of the Charter, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/15
Committee: FEMM
Amendment 58 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination, the right to gender equality and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
2021/07/15
Committee: FEMM
Amendment 64 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens, in particular with regard to gender equality. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/15
Committee: FEMM
Amendment 68 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination, the right to gender equality and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/15
Committee: FEMM
Amendment 87 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including gender equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/07/15
Committee: FEMM
Amendment 104 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Terms and conditions of providers of intermediary services shall respect the essential principles of human rights as enshrined in the Charter and international law.
2021/07/15
Committee: FEMM
Amendment 114 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them, in at least any of the official EU languages the notifier may wish, of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
2021/07/15
Committee: FEMM
Amendment 117 #
Proposal for a regulation
Article 14 – paragraph 5 a (new)
5 a. The provider of intermediary services shall also notify the recipient who provided the information, where contact details are available, giving them the opportunity to reply before taking a decision, unless this would obstruct the prevention and prosecution of serious criminal offences.
2021/07/15
Committee: FEMM
Amendment 118 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6 a. Upon receipt of a valid notice, providers of hosting services shall act expeditiously to disable access to content which is manifestly illegal.
2021/07/15
Committee: FEMM
Amendment 119 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6 b. The provider of hosting services shall ensure that processing of notices is undertaken by qualified staff to whom adequate initial and ongoing training on the applicable legislation and international human rights standards, including anti-discrimination, as well as appropriate working conditions are to be provided, including, where relevant, professional support, qualified psychological assistance and legal advice.
2021/07/15
Committee: FEMM
Amendment 127 #
Proposal for a regulation
Article 24 a (new)
Article 24 a
Protections against image-based sexual abuse
Where an online platform is primarily used for the dissemination of user generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure:
(a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration;
(b) professional human-powered content moderation in line with Article 14(6d), where content having a high probability of being illegal, such as content depicting to be voyeuristic or enacting rape scenes, is reviewed;
(c) the accessibility of an anonymous qualified notification procedure in the form that, additionally to the mechanism referred to in Article 14 and respecting the same principles with the exception of paragraph 5a of that Article, individuals may notify the platform with the claim that image material depicting them or purporting to be depicting them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure shall be considered manifestly illegal in terms of Article 14(6a) and shall be suspended within 48 hours.
2021/07/15
Committee: FEMM
Amendment 129 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), on an ongoing basis, the probability and severity of any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/15
Committee: FEMM
Amendment 131 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights listed in the Charter, in particular the right to respect for private and family life, freedom of expression and information, the prohibition of discrimination, the right to gender equality and the rights of the child, as enshrined in Articles 7, 11, 21, 23 and 24 of the Charter respectively;
2021/07/15
Committee: FEMM
Amendment 147 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of advertisements or illegal content in association with the service they provide;
2021/07/15
Committee: FEMM
Amendment 160 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, including civil society organisations working on gender equality, experts on fundamental rights and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/15
Committee: FEMM
Amendment 165 #
Proposal for a regulation
Article 48 – paragraph 5
5. The Board may invite experts and observers to attend its meetings, in particular on fundamental rights and gender equality, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.
2021/07/15
Committee: FEMM
Amendment 166 #
Proposal for a regulation
Article 48 – paragraph 5 a (new)
5 a. The composition of the Board shall be gender balanced.
2021/07/15
Committee: FEMM