
Activities of Irena JOVEVA related to 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/10/05
Committee: CULT
Dossiers: 2020/0361(COD)
Documents: PDF(421 KB) DOC(265 KB)
Authors: Sabine VERHEYEN (MEP ID 96756)

Amendments (59)

Amendment 119 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the fundamental rights to privacy and to the protection of personal data, the freedom and pluralism of the media, the freedom to conduct a business, and the right to non-discrimination.
2021/07/23
Committee: CULT
Amendment 142 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online, while ensuring that what is legal offline should also be legal online. The concept of “illegal content” should be defined appropriately and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech, terrorist content or unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, the unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use or illegal dissemination of copyright-protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law, and what the precise nature or subject matter of the law in question is.
2021/07/23
Committee: CULT
Amendment 145 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks, content-sharing platforms, search engines, livestreaming platforms, messaging services or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/23
Committee: CULT
Amendment 147 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited or large number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/23
Committee: CULT
Amendment 150 #
Proposal for a regulation
Recital 17
(17) The relevant rules of Chapter II should only establish when the provider of intermediary services concerned cannot be held liable in relation to illegal content provided by the recipients of the service. Those rules should not be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine. Furthermore, the exemptions from liability established in this Regulation should apply in respect of any type of liability as regards any type of illegal content, irrespective of the precise subject matter or nature of those laws; however, they should not apply to matters relating to information society services covered by Regulation (EU) 2016/679 and Directive 2002/58/EC, including the liability of controllers and processors.
2021/07/23
Committee: CULT
Amendment 154 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role, including moderating or promoting content, and has knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of the intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
2021/07/23
Committee: CULT
Amendment 159 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/07/23
Committee: CULT
Amendment 161 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law, and enforce, with reasonable efforts, the terms and conditions of providers of intermediary services to limit harmful content such as disinformation, harassment, and hate speech. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
2021/07/23
Committee: CULT
Amendment 180 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content of, and on acting responsibly in applying and enforcing, the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. Terms and conditions should be summarised in a clear, accessible and easily comprehensible manner, while offering the possibility of opting out from optional clauses.
2021/07/23
Committee: CULT
Amendment 182 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should publish annual, publicly available reports, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions, together with a comprehensive anonymised statistical analysis of the measures taken and of the misuses of services and manifestly unfounded notices or complaints under the mechanisms established under this Regulation. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC.40 _________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/23
Committee: CULT
Amendment 192 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of efficient, proportionate and accurate automated means accompanied by human oversight, that provider should inform the recipient and the possible notifier of its decision, the reasons for its decision and the available effective redress possibilities to rapidly contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/23
Committee: CULT
Amendment 195 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations, and they are encouraged to do so. _________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/23
Committee: CULT
Amendment 200 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in detecting, identifying and notifying illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions, including their competence and objectivity. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/23
Committee: CULT
Amendment 208 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or incompatible with an online platform’s terms and conditions, respectively that the notices or complaints are unfounded or that the mechanisms established under this Regulation have been abused. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress against the decisions taken in this regard by online platforms should always be available, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/23
Committee: CULT
Amendment 210 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and a detailed and comprehensive explanation of its suspicion. However, this Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms; the confidentiality of an ongoing investigation should be protected, and the protection of the data subject, as set out in Article 23(1) and (2) of Regulation (EU) 2016/679, should be complied with. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/23
Committee: CULT
Amendment 213 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling, and platforms should move towards less intrusive forms of advertising that do not require any tracking of user interaction with content. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Additionally, online platforms should provide customers to whom they supply online advertising, when requested and to the extent possible, with information that allows the customer to understand how data was processed, the categories of data or criteria on the basis of which ads may appear, and the data that was disclosed to advertisers or third parties, and refrain from using any aggregated or non-aggregated data, which may include anonymised and personal data, without the data subject’s explicit consent. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/23
Committee: CULT
Amendment 215 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, specifically regarding disinformation, online harassment, hate speech or any other types of harmful content, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/07/23
Committee: CULT
Amendment 216 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures, including by adapting algorithmic recommender systems and online interfaces, in particular as regards their potential for amplifying certain content, including disinformation.
2021/07/23
Committee: CULT
Amendment 220 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition, or the way platforms’ terms and conditions, including content moderation policies, are enforced. Therefore, it is necessary to promote adequate changes in platforms’ conduct, a more accountable information ecosystem, enhanced fact-checking capabilities and collective knowledge on disinformation, and the use of new technologies to improve the way information is produced and disseminated online. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/23
Committee: CULT
Amendment 227 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should, under such mitigating measures, consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or limiting advertisement for third party advertisers and publishers, or other actions, such as improving the visibility of authoritative information sources, labelling of misleading content, as well as making moderation policies fully transparent to the users. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/23
Committee: CULT
Amendment 233 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options to modify or influence those main parameters, including options that are not based on profiling of the recipient. When recommender systems use profiling of the recipient they should at least provide to the recipient a description of the basis upon which profiling is performed, including whether personal data and data derived from user activity is relied on, the processing applied, the purpose for which the profile is prepared and eventually used, and the impact of such profiling, and should seek the recipient’s explicit consent in a user-friendly manner.
2021/07/23
Committee: CULT
Amendment 237 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/07/23
Committee: CULT
Amendment 244 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal or harmful content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation and propagation of fake or misleading information, sometimes with a purpose of obtaining economic or political gain, which are particularly harmful for recipients of the service. Another area for consideration is to improve transparency regarding the origin of information and the way it is produced, sponsored, disseminated and targeted, to promote diversity of information through support of high-quality journalism and of the relation between information creators and distributors, and to foster the credibility of information by providing an indication of its trustworthiness and improving the traceability of information from influential information providers. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/07/23
Committee: CULT
Amendment 257 #
Proposal for a regulation
Article 1 – paragraph 5 – point b a (new)
(b a) Directive (EU) 2019/790;
2021/07/23
Committee: CULT
Amendment 266 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
(g a) ‘Profiling’ means any form of automated processing of personal data as defined in point 4 of Article 4 of Regulation (EU) 2016/679;
2021/07/23
Committee: CULT
Amendment 267 #
Proposal for a regulation
Article 2 – paragraph 1 – point g b (new)
(g b) ‘Personal data’ means any information as defined in point 1 of Article 4 of Regulation (EU) 2016/679;
2021/07/23
Committee: CULT
Amendment 270 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, rank, prioritise, select and display in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
2021/07/23
Committee: CULT
Amendment 283 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress available to the provider of the service and to the recipient of the service who provided the content, including information about effective remedies;
2021/07/23
Committee: CULT
Amendment 291 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 1
— a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
2021/07/23
Committee: CULT
Amendment 294 #
Proposal for a regulation
Article 9 – paragraph 2 – point a – indent 2
— information about content of the order and redress available to the provider and to the recipients of the service concerned;
2021/07/23
Committee: CULT
Amendment 304 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting out from optional clauses and the remedies available, such as options to modify or influence the main parameters of recommender systems and advertising options not based on profiling;
2021/07/23
Committee: CULT
Amendment 313 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, separately for each Member State, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;
2021/07/23
Committee: CULT
Amendment 315 #
Proposal for a regulation
Article 13 – paragraph 1 – point b a (new)
(b a) the number of fact-checkers, content moderators and trusted flaggers reporting for each Member State, accompanied by a statistical analysis of the use made of automated means and of the human oversight of such means;
2021/07/23
Committee: CULT
Amendment 316 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed, including decisions reversed based on redress possibilities.
2021/07/23
Committee: CULT
Amendment 331 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where possible, additional information enabling the identification of the illegal content;
2021/07/23
Committee: CULT
Amendment 338 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify that individual or entity whose content was removed or challenged of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. The provider shall ensure that the decision-making process is reviewed and that any final action is taken by qualified staff, regardless of the automated means used;
2021/07/23
Committee: CULT
Amendment 341 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notifications referred to in paragraphs 4 and 5.
2021/07/23
Committee: CULT
Amendment 348 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient and the notifier, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/23
Committee: CULT
Amendment 349 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means accompanying the decision, including where the decision was taken in respect of content detected or identified using automated means;
2021/07/23
Committee: CULT
Amendment 360 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means and have adequate human oversight.
2021/07/23
Committee: CULT
Amendment 379 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform, political parties or commercial interest;
2021/07/23
Committee: CULT
Amendment 382 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, and inform the Board and other Digital Services Coordinators, providing the necessary explanations and supporting documents.
2021/07/23
Committee: CULT
Amendment 384 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. Before revoking that status, the Digital Services Coordinator shall also inform the Board and the other Digital Services Coordinators of the decision made regarding the revocation of the trusted flagger status.
2021/07/23
Committee: CULT
Amendment 386 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, or otherwise restrict, for a reasonable period of time and after having issued a prior warning, the provision of their services, or of some features of their services, to recipients of the service that frequently provide manifestly illegal content.
2021/07/23
Committee: CULT
Amendment 389 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms may suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals, entities or complainants that abuse the notice and action mechanisms and internal complaint-handling systems by frequently submitting notices or complaints that are manifestly unfounded.
2021/07/23
Committee: CULT
Amendment 397 #
Proposal for a regulation
Article 20 – paragraph 4
4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse, and the duration of the suspension or of other restrictions of services imposed on recipients of the service.
2021/07/23
Committee: CULT
Amendment 398 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information that is available to, and can be quickly accessed by, the online platform.
2021/07/23
Committee: CULT
Amendment 400 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions or other restrictions of services imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints, and presented separately by the means through which the misuse was identified, namely out-of-court dispute settlement, the notice and action mechanism or orders from a judicial or administrative authority;
2021/07/23
Committee: CULT
Amendment 402 #
Proposal for a regulation
Article 23 – paragraph 1 – point c
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied, including human oversight and the decisions made.
2021/07/23
Committee: CULT
Amendment 404 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the advertising agency or publisher managing the advertising campaign, including the criteria used by the ad-tech platform services, such as pricing mechanisms, advertising auctions and their weighting, the fees charged by ad exchanges, and the identity of the natural or legal person(s) responsible for the possible automated system;
2021/07/23
Committee: CULT
Amendment 406 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed, including how the information is algorithmically ranked and prioritised on users’ online interfaces, presented in an easily comprehensible manner;
2021/07/23
Committee: CULT
Amendment 407 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(c a) Providers of intermediary services shall, by default, not make the recipients of their services subject to targeted, micro-targeted and behavioural advertising unless the recipient of the service, as the data subject, has given explicit consent via an opt-in.
2021/07/23
Committee: CULT
Amendment 423 #
Proposal for a regulation
Article 27 – paragraph 1 – point b
(b) targeted measures aimed at limiting the display of advertisements in association with the service they provide, limiting providers of disinformation and the monetisation of fake news, and limiting the reach of advertising and of advertisements identified as posing a risk pursuant to Article 26;
2021/07/23
Committee: CULT
Amendment 430 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. Before adopting those guidelines the Commission shall organise public consultations and seek the consent of the European Parliament.
2021/07/23
Committee: CULT
Amendment 437 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679, which shall be the default, while other options shall require the explicit consent of the recipient to opt in to profiling and to choose the main parameters for the recommender system.
2021/07/23
Committee: CULT
Amendment 438 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. When a user creates an account, the default settings for recommender systems shall not be based on profiling and shall give the user, in an easily comprehensible manner, a choice to set the main parameters to be used in recommender systems.
2021/07/23
Committee: CULT
Amendment 448 #
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
(e a) whether the aggregated or non-aggregated data used for advertising purposes that is provided for or generated in the context of the use of the relevant services came from third parties, in particular with regard to ad inventory and intermediation services owned by other publishers or service providers connected with the platform.
2021/07/23
Committee: CULT
Amendment 459 #
Proposal for a regulation
Article 32 – paragraph 1
1. Very large online platforms shall appoint one or more compliance officers for every Member State, operating in the official language of the Member State concerned, responsible for monitoring their compliance with this Regulation.
2021/07/23
Committee: CULT
Amendment 463 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations, the European Parliament and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/23
Committee: CULT