Activities of Mikuláš PEKSA related to 2020/0361(COD)
Plenary speeches (1)
Digital Services Act (debate)
Opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Shadow opinions (1)
OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Amendments (110)
Amendment 74 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the right to privacy and the right to protection of personal data, the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.
Amendment 77 #
Proposal for a regulation
Recital 5
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their responsibility to uphold fundamental rights. _________________ 26Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
Amendment 82 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly to cover information relating to illegal content, products, services and activities following the Member State of origin principle. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 91 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. Accordingly, where access to information requires registration or admittance to a group of users, that information should be considered to be disseminated to the public only where users seeking to access the information are automatically registered or admitted without a human decision or selection of whom to grant access. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation, as they are not considered to be disseminated to the public. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. _________________ 39Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
Amendment 92 #
Proposal for a regulation
Recital 15 a (new)
(15 a) The general collection of personal data concerning every use of a digital service interferes disproportionately with the right to privacy in the digital age. In line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the possibility to access and pay for information society services anonymously wherever technically possible. Similarly, users have a right not to be subject to tracking when using information society services. To this end, the processing of personal data concerning the use of services should be limited to the extent strictly necessary to provide the service and to bill the users.
Amendment 93 #
Proposal for a regulation
Recital 15 b (new)
(15 b) Applying effective end-to-end encryption to data is essential for trust in and security on the Internet, as it effectively prevents unauthorised third party access and helps to ensure confidentiality of communications.
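By way of illustration only, and not as part of the amendment text, the property described in Recital 15 b can be sketched with the PyNaCl library (an assumed choice; the amendment prescribes no particular scheme): encryption and decryption happen only at the endpoints, so a relaying intermediary never handles plaintext.

```python
# Minimal end-to-end encryption sketch using PyNaCl (libsodium bindings).
# Illustrative assumption only; no library or scheme is prescribed anywhere.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()  # Alice's private key, never leaves her device
bob_key = PrivateKey.generate()    # Bob's private key, never leaves his device

# Alice encrypts for Bob with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"confidential message")

# An intermediary relaying `ciphertext` cannot recover the plaintext without a
# private key, which is what prevents unauthorised third-party access.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"confidential message"
```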
Amendment 96 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services has actual knowledge of, or meaningful control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 102 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved in the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
Amendment 106 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
Amendment 108 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 110 #
Proposal for a regulation
Recital 24
(24) The exemptions from liability established in this Regulation should not affect the possibility of injunctions of different kinds against providers of intermediary services, even where they meet the conditions set out as part of those exemptions. Such injunctions could, in particular, consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal content specified in such orders, issued in compliance with Union law, or the disabling of access to it.
Amendment 111 #
Proposal for a regulation
Recital 25
Amendment 117 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduit’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, cloud infrastructure services or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
Amendment 119 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, nor should they use automated tools for content moderation. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
Amendment 138 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourse to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 152 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be readily available against the decisions taken in this regard by online platforms, and they should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
Amendment 165 #
Proposal for a regulation
Recital 50 a (new)
Amendment 166 #
Proposal for a regulation
Recital 51
(51) In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union, in standardised formats and through standardised Application Programming Interfaces.
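As an illustration of what “standardised formats and … standardised Application Programming Interfaces” might mean in practice, the sketch below serialises an average-monthly-active-recipients figure as a fixed JSON payload. All field names are hypothetical assumptions, not taken from the Regulation.

```python
# Hypothetical standardised payload for publishing average monthly active
# recipients in the Union; the schema is an illustrative assumption.
import json
from datetime import date

def active_recipients_report(platform: str, monthly_counts: list[int]) -> str:
    """Serialise the average of the supplied monthly active-recipient counts."""
    return json.dumps({
        "platform": platform,
        "reporting_date": date.today().isoformat(),
        "average_monthly_active_recipients_eu":
            round(sum(monthly_counts) / len(monthly_counts)),
        "months_covered": len(monthly_counts),
    })

print(active_recipients_report("platform.example", [41_000_000, 43_500_000, 44_200_000]))
```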
Amendment 167 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. The surveillance-led advertising model has generated deep changes in the way information is presented and has created new data collection patterns and business models that might negatively affect privacy, personal autonomy, democracy and quality news reporting, and facilitate manipulation and discrimination. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that data collection is kept to a minimum, that the maximisation of revenue from advertising does not limit the quality of the service and that the recipients of the service have the extensive individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 174 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those challenges to fundamental rights, there being no alternative and less restrictive measures that would effectively achieve the same result.
Amendment 178 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they were able to set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures.
Amendment 182 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
Amendment 184 #
Proposal for a regulation
Recital 60
(60) Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent external auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
Amendment 187 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed on the use of recommender systems, and that recipients can easily control the way information is presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. Very large online platforms should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
Amendment 190 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 191 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Interoperability requirements for very large online platforms are desirable as they can create new opportunities for the development of innovative services, overcome the lock-in effect of closed platforms and ensure competition and user choice. These requirements should allow recipients to benefit from cross-platform interaction. Very large online platforms should provide an application programming interface through which third-party platforms and their recipients can interoperate with the main functionalities and recipients of the core services offered by the platform. The main functionalities may include the ability to receive information from certain accounts, to share provided content and to react to it. The interoperability requirements do not prevent platforms from offering non-core additional features to their recipients.
Amendment 193 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without objective reasons, such as technological incompatibility, by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
Amendment 194 #
Proposal for a regulation
Recital 69
Amendment 206 #
Proposal for a regulation
Recital 95
(95) In order to address those public policy concerns it is therefore necessary to provide for a common system of enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the Digital Services Coordinator of establishment, upon its own initiative or upon the Board’s advice, should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. That Digital Services Coordinator should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Digital Services Coordinator should perform its role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board.
Amendment 207 #
Proposal for a regulation
Recital 97
(97) The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. However, it should justify any inaction. Once the Commission has initiated proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request, in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
Amendment 210 #
Proposal for a regulation
Article 1 – paragraph 2 – point a
(a) contribute to the proper functioning of the internal market for intermediary services and encourage competition;
Amendment 219 #
Proposal for a regulation
Article 1 – paragraph 5 a (new)
5 a. This Regulation shall not apply to matters relating to information society services covered by Regulation (EU) 2016/679 and Directive 2002/58/EC.
Amendment 229 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State in which it is hosted, irrespective of the precise subject matter or nature of that law;
Amendment 244 #
Proposal for a regulation
Article 2 a (new)
Article 2 a
Privacy protection
User profiling done by information society service providers shall be conducted only on the basis of the data provided with the user’s explicit and informed consent, as defined in Article 4(11) of Regulation (EU) 2016/679. Any profiling must be done only in relation to users of the service. Information society service providers shall not profile individuals who are not users of the service. Users shall not be profiled regarding their racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data, health, sex life or sexual orientation.
Amendment 245 #
Proposal for a regulation
Article 3 – paragraph 3
3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 251 #
Proposal for a regulation
Article 4 – paragraph 1 – point e
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the illegal content at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
Amendment 254 #
Proposal for a regulation
Article 4 – paragraph 2
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 261 #
Proposal for a regulation
Article 5 – paragraph 4
4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
Amendment 263 #
Proposal for a regulation
Article 6
Amendment 269 #
Proposal for a regulation
Article 7 – title
No general monitoring or automated content moderation obligations
Amendment 270 #
Proposal for a regulation
Article 7 – paragraph 1 a (new)
Providers of intermediary services should never be obliged to use automated content moderation and the non-use of such technologies should not be considered as an aggravating factor when attributing liability.
Amendment 274 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
Amendment 276 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content;
Amendment 280 #
Proposal for a regulation
Article 10 – paragraph 3 a (new)
3 a. Any requests to providers of intermediary services, made on the basis of this legislation, shall be transmitted through the Digital Services Coordinator in the Member State of establishment, who is responsible for collecting requests and communication from all relevant sources.
Amendment 289 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a transparent, non-discriminatory, coherent, predictable, diligent, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
Amendment 294 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Terms and conditions of providers of intermediary services shall respect the essential principles of fundamental rights as enshrined in the Charter and international law.
Amendment 296 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2 b. Terms and conditions that do not comply with this Article shall not be binding on recipients.
Amendment 297 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2 c. All changes in terms and conditions should fully comply with this Article. Intermediary service providers should inform the users of all changes in terms and conditions at least one month in advance.
Amendment 298 #
Proposal for a regulation
Article 12 – paragraph 2 d (new)
2 d. Providers shall use terms and conditions that are as similar as possible across the whole single market, with any divergences clearly marked and justified.
Amendment 302 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as the measures taken to train content moderators and the safeguards to ensure that non-infringing content is not affected;
Amendment 331 #
Proposal for a regulation
Article 14 – paragraph 2 – point c a (new)
(c a) where an alleged infringement of an intellectual property right is notified, evidence that the entity submitting the notice is the rights holder of the intellectual property right that is allegedly infringed or is authorised to act on behalf of that rights holder;
Amendment 333 #
Proposal for a regulation
Article 14 – paragraph 3
Amendment 335 #
Proposal for a regulation
Article 14 – paragraph 4 a (new)
4 a. Upon receipt of a notice of alleged copyright infringement, the service provider shall notify the information providers, using available contact details, of the elements referred to in paragraph 2 and give them the opportunity to reply, within a minimum of five working days, before taking a decision and, if applicable, before disabling access to the content referred to.
Amendment 336 #
Proposal for a regulation
Article 14 – paragraph 4 b (new)
4 b. The provider shall ensure that decisions on notices are taken by qualified staff to whom adequate training as well as appropriate working conditions are to be provided, including, where necessary, the opportunity to seek professional support and qualified psychological assistance.
Amendment 337 #
Proposal for a regulation
Article 14 – paragraph 5
5. The provider shall also, without undue delay, notify the submitting individual or entity and the information provider of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
Amendment 342 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Amendment 353 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
Amendment 365 #
Proposal for a regulation
Article 16 – title
Exclusion for micro, small or medium-sized enterprises
Amendment 370 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
Amendment 374 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service and qualified entities as defined in Article 3, point (4) of Directive (EU) 2020/1828, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
Amendment 378 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, disable, demote, demonetise, restrict or in any other way modify access to the information or otherwise impose sanctions against it;
Amendment 383 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. If the complainant so wishes, the online platform shall also publicly confirm the reversal of the decision.
Amendment 389 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not taken solely on the basis of automated means and always include meaningful human oversight.
Amendment 390 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
1. Recipients of the service addressed by the decisions referred to in Article 17(1), and qualified entities as defined in Article 3, point (4) of Directive (EU) 2020/1828, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected by the recipient with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 393 #
Proposal for a regulation
Article 18 – paragraph 2 – point b
(b) it has the necessary legal expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute;
Amendment 395 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 1
3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall never be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
Amendment 402 #
Proposal for a regulation
Article 19 – paragraph 2 – introductory part
2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established or by the Commission, where the applicant has demonstrated that it meets all of the following conditions:
Amendment 411 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators and the Commission shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2.
Amendment 413 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices or notices regarding legal content through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
Amendment 414 #
Proposal for a regulation
Article 19 – paragraph 6
6. The entity that awarded the status of trusted flagger shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
Amendment 420 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms may suspend, only for a reasonably short period of time and after having issued a prior warning and after providing a comprehensive explanation, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
Amendment 447 #
Proposal for a regulation
Article 22 – paragraph 1 – point b
Amendment 449 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Amendment 460 #
Proposal for a regulation
Article 22 – paragraph 2
2. Very large online platforms shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources.
Amendment 466 #
Proposal for a regulation
Article 22 – paragraph 4
4. The online platform shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned, including the period for redress. They shall subsequently delete the information.
Amendment 481 #
Proposal for a regulation
Article 23 – paragraph 3 a (new)
3 a. Online platforms shall clearly state how and for what purpose they collect data from users of the service and how, to whom and for what purpose they further disseminate the data collected.
Amendment 493 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed and how to change these parameters.
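One possible reading of point (c), shown purely as a sketch: each displayed advertisement carries a machine-readable disclosure record naming the main targeting parameters and a way to change them. Keys and values below are illustrative assumptions, not prescribed by the text.

```python
# Hypothetical per-advertisement transparency record for Article 24;
# field names and categories are assumptions made for illustration.
ad_disclosure = {
    "is_advertisement": True,                     # the ad is identified as such
    "advertiser": "Example Shop Ltd",             # on whose behalf it is displayed
    "main_targeting_parameters": {                # why this recipient sees it
        "language": "cs",
        "approximate_region": "Prague",
        "interest_segment": None,                 # None where no profiling was used
    },
    "how_to_change": "https://platform.example/ads/preferences",
}
```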
Amendment 529 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
Amendment 536 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, in particular the rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, algorithmic biases and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
Amendment 537 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of fundamental rights;
Amendment 541 #
Proposal for a regulation
Article 26 – paragraph 1 – point c a (new)
(c a) readiness to participate in the crisis protocols referred to in Article 37;
Amendment 543 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how the effects of their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 547 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2 a. To ensure a high level of public control and transparency, these yearly risk assessments should be made as transparent as possible, by means of open access data. The outcome of the risk assessment and supporting documents shall be communicated to the Board and the Digital Services Coordinator of establishment. A summary version of the risk assessment shall be made publicly available in an easily accessible format.
Amendment 550 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms may put in place reasonable, proportionate and effective measures to cease, prevent and mitigate the specific systemic risks identified pursuant to Article 26 without disproportionately impacting fundamental rights. Such measures may include, where applicable:
Amendment 556 #
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(a a) ensuring appropriate staffing to deal with notices and complaints;
Amendment 559 #
Proposal for a regulation
Article 27 – paragraph 1 – point d a (new)
(d a) targeted measures to mitigate environmental risks and promote sustainability through their services, and to establish criteria for energy-sufficient digital services.
Amendment 560 #
Proposal for a regulation
Article 27 – paragraph 1 – point e
Amendment 565 #
Proposal for a regulation
Article 27 – paragraph 2 a (new)
2 a. The reports referred to in paragraph 2 shall be disseminated to the general public and include standardised, open data describing the systemic risks, especially risks to fundamental rights.
Amendment 567 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general recommendations on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those recommendations the Commission shall organise public consultations.
Amendment 575 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
Amendment 579 #
Proposal for a regulation
Article 28 – paragraph 1 a (new)
1 a. Audits shall be performed at least on the following:
i. the clarity, coherence and predictable enforcement of terms of service, with particular regard to the applicable fundamental rights as enshrined in the Charter;
ii. the completeness, methodology and consistency of the transparency reporting obligations as set out in Articles 13, 23, 24 and 30, as well as respect for the highest possible standards on transparency reporting;
iii. the accuracy, predictability and clarity of the provider’s follow-up for recipients of the service and notice providers to notices of illegal content and terms of service violations, and the accuracy of the classification (illegal or terms and conditions violation) of removed information;
iv. internal and third-party complaint handling mechanisms;
v. interaction with trusted flaggers and independent assessment of accuracy, response times and efficiency, and whether there are indications of abuse;
vi. diligence with regard to the verification of the traceability of traders;
vii. the effectiveness of and compliance with codes of conduct;
viii. data sufficiency, aiming at the reduction of data generation in general and of traffic wherever possible, including the reduction of the associated electricity consumption and resources of data centres, as referred to in Article 27;
ix. readiness to participate in the crisis protocols referred to in Article 37.
Audits on the subjects mentioned in points (i) to (vii) may be combined where the organisation performing the audits has subject-specific expertise on the subject matters at hand.
Amendment 581 #
Proposal for a regulation
Article 28 – paragraph 2 – introductory part
2. Audits performed pursuant to paragraphs 1 and 1a shall be performed by organisations which:
Amendment 590 #
Proposal for a regulation
Article 28 – paragraph 4 a (new)
4 a. The audits shall, immediately after completion, be submitted to the Digital Services Coordinators, the European Union Agency for Fundamental Rights and the Commission. Audit findings, not including sensitive information, shall be made public. The Digital Services Coordinators, the European Union Agency for Fundamental Rights and the Commission may provide a public comment on the audits.
Amendment 597 #
Proposal for a regulation
Article 29 – paragraph 1
1. Online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available. Very large online platforms must include at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679, as well as keep a log of all the significant changes implemented to the recommender system.
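A minimal sketch of what this obligation could look like in practice, assuming hypothetical ranking functions: the platform offers several recommender options, at least one of which uses no profiling, and keeps a log of significant changes. Nothing here is prescribed by the amendment.

    import datetime
    from typing import Callable

    def rank_by_profile(items, user_profile):
        # Profiling-based ranking (profiling within the meaning of
        # Article 4(4) GDPR): uses per-user interest scores.
        return sorted(items, key=lambda i: user_profile.get(i["topic"], 0),
                      reverse=True)

    def rank_by_recency(items, user_profile):
        # Non-profiling option: purely chronological, no personal data used.
        return sorted(items, key=lambda i: i["posted_at"], reverse=True)

    # The main parameters exposed to recipients; at least one option
    # ("most_recent" here) is not based on profiling.
    RECOMMENDER_OPTIONS: dict[str, Callable] = {
        "personalised": rank_by_profile,
        "most_recent": rank_by_recency,
    }

    # Log of all significant changes implemented to the recommender system.
    change_log: list[tuple[str, str]] = []

    def record_change(description: str) -> None:
        change_log.append((datetime.datetime.utcnow().isoformat(), description))

    record_change("Added non-profiling 'most_recent' ranking option")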
Amendment 605 #
Proposal for a regulation
Article 29 – paragraph 2
2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
Amendment 607 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2 a. Only data provided voluntarily by the user, and with clear consent, may be used in recommender systems. Refusing consent shall be as easy and as visible for the recipient, and no more time-consuming, than giving consent. The provision of an information society service by very large online platforms shall not be conditional on consent to user profiling.
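A minimal sketch of this consent gate, with invented field names: profiling data feeds the recommender only upon explicit consent, and the service keeps working when consent is refused.

    # Illustrative consent gate; 'profiling_consent' and 'voluntary_data'
    # are assumed names, not terms from the amendment.
    def recommender_inputs(user):
        """Return only the data the recommender may lawfully use."""
        if user.get("profiling_consent") is True:
            return user.get("voluntary_data", {})
        return {}  # no consent: the recommender runs without profiling data

    # Service access is never conditional on consent:
    user = {"profiling_consent": False,
            "voluntary_data": {"topics": ["science"]}}
    assert recommender_inputs(user) == {}  # service still works, unprofiled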
Amendment 630 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. This shall include personal data only where it is lawfully accessible to the public.
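The data-access obligation in this paragraph could look like the following Python sketch of a research endpoint; the record fields and the 'is_public' flag are assumptions. The point is the filtering rule: personal data is released only where it is already lawfully accessible to the public.

    # Illustrative data store; fields are invented for the example.
    RECORDS = [
        {"post_id": 1, "text": "...", "author": "alice", "is_public": True},
        {"post_id": 2, "text": "...", "author": "bob",   "is_public": False},
    ]

    def research_api(records):
        """Return records for vetted researchers, masking non-public personal data."""
        out = []
        for r in records:
            item = {"post_id": r["post_id"], "text": r["text"]}
            if r["is_public"]:  # personal data only where lawfully public
                item["author"] = r["author"]
            out.append(item)
        return out

    print(research_api(RECORDS))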
Amendment 638 #
Proposal for a regulation
Article 31 – paragraph 7 a (new)
7 a. Research conducted under this regime should always be built on open access principles and use standardised data sets, in order to ensure a high level of transparency and accountability regarding the proper use of the data provided.
Amendment 639 #
Proposal for a regulation
Article 31 – paragraph 7 b (new)
7 b. Upon completion of their research, the vetted researchers that have been granted access to data shall publish their findings without disclosing personal data.
Amendment 645 #
Proposal for a regulation
Article 33 a (new)
Article 33 a
Interoperability
1. Very large online platforms shall make the core functionalities of their services interoperable with other online platforms to enable cross-platform communication. This obligation shall not limit, hinder or delay their ability to solve security issues and shall be in compliance with all their responsibilities, especially regarding fundamental rights and the protection of privacy. Online platforms shall not process information obtained for the purpose of cross-platform information exchange for other purposes.
2. Very large online platforms shall publicly document all application programming interfaces they make available and keep that documentation continuously up to date.
3. The Commission shall adopt implementing measures specifying the nature and scope of the obligations set out in paragraphs 1 and 2, taking into account not only the individual cases of different very large online providers, but also the market as a whole.
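A sketch, under assumptions, of what a documented cross-platform interface under paragraphs 1 and 2 might look like: the message format and endpoint names are invented, since the amendment leaves the concrete specification to the Commission's implementing measures. Note the purpose-limitation rule from paragraph 1: data received for interoperability must not be reused.

    from dataclasses import dataclass

    # Hypothetical message format; real formats would come from the
    # Commission's implementing measures, not from this sketch.
    @dataclass
    class CrossPlatformMessage:
        sender: str     # e.g. "alice@platform-a.example"
        recipient: str  # e.g. "bob@platform-b.example"
        body: str

    class InteropEndpoint:
        """Receives messages from other platforms; purpose-limited by design."""

        def receive(self, msg: CrossPlatformMessage) -> None:
            # Paragraph 1: information obtained for cross-platform exchange
            # must not be processed for other purposes (no ad profiling,
            # no analytics); delivery to the inbox is the only processing.
            self.deliver_to_inbox(msg)

        def deliver_to_inbox(self, msg: CrossPlatformMessage) -> None:
            print(f"deliver {msg.body!r} to {msg.recipient}")

    InteropEndpoint().receive(
        CrossPlatformMessage("alice@platform-a.example",
                             "bob@platform-b.example", "hi")
    )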
Amendment 649 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2 a. In order to facilitate the implementation of national and global environmental policy, as well as to allow individuals to make informed choices and improve competition, the European Commission shall support and promote the development and implementation of standards for the reporting of the environmental impact of the provision of the services provided by very large online platforms.
Amendment 652 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
Amendment 657 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
Amendment 661 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. The Commission shall be responsible for the drafting, implementation and scrutiny of the crisis protocols and shall report on them annually to the European Parliament.
Amendment 662 #
Proposal for a regulation
Article 37 – paragraph 5 a (new)
5 a. All crisis protocols shall be subject to scrutiny by the European Parliament and shall require its approval before being put in place.
Amendment 663 #
Proposal for a regulation
Article 37 a (new)
Amendment 679 #
Proposal for a regulation
Article 42 – paragraph 1
1. The Commission shall lay down the rules on penalties applicable to infringements of this Regulation by providers of intermediary services under its jurisdiction and shall take all the necessary measures to ensure that they are implemented in accordance with Article 41.
Amendment 680 #
Proposal for a regulation
Article 42 – paragraph 2
2. Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them.
Amendment 682 #
Proposal for a regulation
Article 42 – paragraph 3
3. The Commission shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply to or rectify incorrect, incomplete or misleading information, and failure to submit to an on-site inspection shall not exceed 1 % of the annual income or turnover of the provider concerned.
Amendment 685 #
Proposal for a regulation
Article 42 – paragraph 4
4. The Commission shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned.
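A worked example of the ceilings in paragraphs 3 and 4, using an invented annual turnover of EUR 2 billion purely for illustration:

    # Penalty ceilings under Article 42(3) and (4); the turnover figure
    # is hypothetical and chosen only to make the arithmetic concrete.
    annual_turnover = 2_000_000_000         # EUR, invented provider
    avg_daily_turnover = annual_turnover / 365

    max_penalty = 0.06 * annual_turnover           # 6 % cap, paragraph 3
    max_info_penalty = 0.01 * annual_turnover      # 1 % cap, incorrect information
    max_daily_payment = 0.05 * avg_daily_turnover  # 5 % of avg daily turnover, paragraph 4

    print(f"general cap:     EUR {max_penalty:,.0f}")       # 120,000,000
    print(f"information cap: EUR {max_info_penalty:,.0f}")  # 20,000,000
    print(f"per-day cap:     EUR {max_daily_payment:,.0f}") # ~273,973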