
Activities of Brando BENIFEI related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (74)

Amendment 96 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information, the freedom to conduct a business, privacy and personal data protection, the right to non-discrimination and access to justice.
2021/07/20
Committee: JURI
Amendment 99 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral and outcome-oriented, innovation should not be hampered but instead be stimulated.
2021/07/20
Committee: JURI
Amendment 128 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. Similarly, a communication channel in an online game could also constitute such a feature.
2021/07/20
Committee: JURI
Amendment 149 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the Charter of Fundamental Rights of the European Union, including the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/07/20
Committee: JURI
Amendment 158 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner and accompanied by additional safeguards. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/20
Committee: JURI
Amendment 162 #
Proposal for a regulation
Recital 26
(26) Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed or open online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content.
2021/07/20
Committee: JURI
Amendment 168 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content or as an obligation to use automated content-filtering tools.
2021/07/20
Committee: JURI
Amendment 187 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. Children have specific rights enshrined in Article 24 of the Charter of Fundamental Rights of the European Union and in the United Nations Convention on the Rights of the Child. As such, the best interests of the child should be a primary consideration in all matters affecting them. The UNCRC General comment No. 25 on children’s rights in relation to the digital environment formally sets out how these rights apply to the digital world.
2021/07/08
Committee: IMCO
Amendment 188 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights and freedoms guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, a high level of consumer protection and the right to non-discrimination.
2021/07/08
Committee: IMCO
Amendment 188 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant and up-to-date information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/07/20
Committee: JURI
Amendment 190 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. In particular, it is important to ensure that terms and conditions are fair, non-discriminatory and transparent, and are drafted in a clear and unambiguous language in line with applicable Union law. The terms and conditions should include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making, human review, the legal consequences to be faced by the users for knowingly storing or uploading illegal content as well as on the right to terminate the use of the service. Providers of intermediary services should also provide recipients of services with a concise and easily readable summary of the main elements of the terms and conditions, including the remedies available.
2021/07/20
Committee: JURI
Amendment 201 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
2021/07/20
Committee: JURI
Amendment 221 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. Dispute resolution proceedings should be concluded within a reasonable period of time. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/20
Committee: JURI
Amendment 226 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests or are individual rightholders and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders, as well as individual rightholders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/20
Committee: JURI
Amendment 233 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law where that is in conformity with Union law and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 245 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by this Regulation. This Regulation should cover, for example, providers of social media, video, image and audio-sharing services, as well as file-sharing services and other cloud services, insofar as those services are used to make the stored information available to the public at the direct request of the content provider. Where a service provider offers services other than hosting, this Regulation should apply only to the services that fall within its scope. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/08
Committee: IMCO
Amendment 250 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by the definition of online platforms. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36
2021/07/08
Committee: IMCO
Amendment 271 #
Proposal for a regulation
Recital 21
(21) A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not select, rank or modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted.
2021/07/08
Committee: IMCO
Amendment 290 #
Proposal for a regulation
Recital 65 a (new)
(65 a) Recipients of a service are often locked in to existing platforms due to network effects, which significantly limits user choice. In order to facilitate free choice of recipients between different services, it is therefore important to consider interoperability for industry-standard features of very large online platforms, such as core messaging functionality or image-sharing services. Such interoperability would empower recipients to choose a service based on its functionality and features such as security, privacy, and data processing standards, rather than its existing user base.
2021/07/19
Committee: JURI
Amendment 302 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and with the appropriate safeguards against over-removal of legal content. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
2021/07/08
Committee: IMCO
Amendment 343 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market, and to ensure a safe and transparent online environment and a high level of consumer protection, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety, security and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 345 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as health – including mental health, the safety and trust of the recipients of the service, including minors and vulnerable users, protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 360 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
2021/07/08
Committee: IMCO
Amendment 415 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations, consumer organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/08
Committee: IMCO
Amendment 438 #
Proposal for a regulation
Article 6 – paragraph 1 c (new)
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as human oversight, documentation, traceability, transparency of algorithms used or additional measures to ensure the accuracy, fairness, transparency and non-discrimination of voluntary own-initiative investigations.
2021/07/19
Committee: JURI
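To make the safeguards concrete: human oversight, documentation and traceability map naturally onto an auditable record kept for every voluntary own-initiative investigation. Below is a minimal Python sketch of such a record; all names and fields are hypothetical assumptions, not anything prescribed by the amendment.

# Hypothetical audit record illustrating the documentation, traceability and
# human-oversight safeguards of the proposed Article 6, paragraph 1c.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class InvestigationRecord:
    content_id: str                        # item flagged by the automated tool
    tool_name: str                         # which automated system raised the flag
    tool_version: str                      # traceability: exact model or rule version
    flagged_at: datetime                   # when the flag was raised
    reason: str                            # documented ground for the flag
    reviewed_by: Optional[str] = None      # human oversight: reviewer identity
    review_outcome: Optional[str] = None   # e.g. "confirmed" or "overturned"

    def record_human_review(self, reviewer: str, outcome: str) -> None:
        # Attach the human review before any action is taken on the content.
        self.reviewed_by = reviewer
        self.review_outcome = outcome

record = InvestigationRecord(
    content_id="item-42", tool_name="spam-classifier", tool_version="1.3.0",
    flagged_at=datetime.now(timezone.utc), reason="suspected counterfeit listing")
record.record_human_review(reviewer="moderator-7", outcome="confirmed")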
Amendment 441 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. Providers of intermediary services shall not be obliged to use automated tools for content moderation.
2021/07/19
Committee: JURI
Amendment 447 #
Proposal for a regulation
Recital 50 a (new)
(50a) After having obtained the necessary contact information of a trader, which is aimed at ensuring consumer rights, a provider of intermediary services needs to verify that these details are consistently kept up to date and accessible for consumers. Therefore, it shall conduct regular and randomised checks on the information provided by the traders on its platform. To ensure a consistent display of this contact information, intermediary services should establish mandatory designs for its inclusion. Content, goods or services shall only be displayed after all necessary information has been made available by the business user.
2021/07/08
Committee: IMCO
Amendment 452 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 1
— a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed with due regard to fundamental rights of the recipient of the service concerned;
2021/07/19
Committee: JURI
Amendment 481 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including unsafe, counterfeit or non-compliant products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systematic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/07/08
Committee: IMCO
Amendment 508 #
Proposal for a regulation
Recital 65 a (new)
(65a) Due to their market position, very large online platforms have developed an increasing influence over society’s social, economic, and political interactions. Consumers face a lock-in situation, which may lead them into accepting unfavourable terms and conditions to participate in the services provided by these very large online platforms. To restore a competitive market and to allow consumers more choices, very large online platforms should be required to set up the necessary technical access points to create interoperability for their core services, with a view to allowing competitors fairer market access and enabling more choice for consumers, while at the same time complying with privacy, security and safety standards. These access points should create interoperability for other online platform services of the same type, without the need to convert digital content or services to ensure functionality.
2021/07/08
Committee: IMCO
Amendment 534 #
Proposal for a regulation
Recital 73
(73) Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure regular reporting and effective involvement of all relevant authorities in the supervision and enforcement at Union level.
2021/07/08
Committee: IMCO
Amendment 547 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Providers of intermediary services shall provide recipients of services with a concise and easily readable summary of the terms and conditions. That summary shall identify the main elements of the information requirements, including the possibility of easily opting out from optional clauses and the remedies available.
2021/07/19
Committee: JURI
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
2021/07/08
Committee: IMCO
Amendment 636 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2020/1828;
2021/07/08
Committee: IMCO
Amendment 637 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4a. This article shall not apply when a recipient of the service decides to remove or disable access to specific items of information provided by other recipients of the service.
2021/07/19
Committee: JURI
Amendment 639 #
Proposal for a regulation
Article 15 a (new)
Article 15a Content moderation 1. Providers of hosting services shall not use ex-ante control measures for content moderation based on automated tools or ex-ante filtering of content. Where providers of hosting services use automated tools for content moderation, they shall ensure qualified human oversight for any action taken and that legal content which does not infringe the terms and conditions set out by the provider is not affected. This paragraph shall not apply to moderating information which has most likely been provided by automated tools. 2. Providers of hosting services shall act in a fair, transparent, coherent, predictable, non-discriminatory, diligent, non-arbitrary and proportionate manner when moderating content, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service. Content moderation practices shall be proportionate to the type and volume of content, relevant and limited to what is necessary for the purposes for which the content is moderated. 3. Providers of hosting services shall not subject recipients of the service to discriminatory practices, exploitation or exclusion for the purposes of content moderation, such as removal of user-generated content based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.
2021/07/19
Committee: JURI
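Read operationally, paragraph 1 of the proposed Article 15a allows automated tools to detect and flag content but rules out ex-ante automated removal: any action must pass qualified human review. A minimal Python sketch of that control flow follows; the function names and the stand-in classifier are illustrative assumptions only.

# Sketch: automated tools may only flag and enqueue content; removal or
# disabling of access happens solely after qualified human review.
from queue import Queue

review_queue: Queue = Queue()   # work queue for qualified human moderators

def looks_illegal(item: dict) -> bool:
    # Stand-in for an automated classifier; never acts on content itself.
    return "counterfeit" in item.get("tags", [])

def automated_scan(item: dict) -> None:
    # Detection only: enqueue for human review, no ex-ante filtering.
    if looks_illegal(item):
        review_queue.put(item)

def remove_content(item: dict) -> None:
    print(f"removed {item['id']} after human review")

def human_review(item: dict, moderator_decision: str) -> None:
    # Only a human decision leads to removal; legal content stays untouched.
    if moderator_decision == "remove":
        remove_content(item)

automated_scan({"id": "post-1", "tags": ["counterfeit"]})
human_review(review_queue.get(), moderator_decision="remove")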
Amendment 640 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
2021/07/08
Committee: IMCO
Amendment 646 #
Proposal for a regulation
Article 1 a (new)
Article 1a Objective The aim of this Regulation is to contribute to the proper functioning of the internal market by setting out harmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/08
Committee: IMCO
Amendment 666 #
Proposal for a regulation
Article 2 – paragraph 1 – point d a (new)
(da) ‘child’ means any natural person under the age of 18;
2021/07/08
Committee: IMCO
Amendment 677 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5a. Online platforms shall ensure that any relevant information in relation to decisions taken by the internal complaint- handling mechanism is available to recipients of the service for the purpose of seeking redress through an out-of-court dispute settlement body pursuant to Article 18 or before a court.
2021/07/19
Committee: JURI
Amendment 697 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1
The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time.
2021/07/19
Committee: JURI
Amendment 701 #
Proposal for a regulation
Article 18 – paragraph 3 – subparagraph 2
Certified out-of-court dispute settlement bodies shall make information on the fees, or the mechanisms used to determine the fees, publicly available.
2021/07/19
Committee: JURI
Amendment 721 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests or is an individual rightholder and is independent from any online platform;
2021/07/19
Committee: JURI
Amendment 724 #
Proposal for a regulation
Article 19 – paragraph 2 – point c
(c) it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner and in full respect of fundamental rights such as the freedom of expression and information.
2021/07/19
Committee: JURI
Amendment 765 #
Proposal for a regulation
Article 5 – paragraph 2
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority, decisive influence or the control of the provider.
2021/07/08
Committee: IMCO
Amendment 777 #
Proposal for a regulation
Article 5 a (new)
Article 5a Liability of online platform allowing consumers to conclude distance contracts with traders 1. In addition to Article 5(1), an online platform allowing consumers to conclude distance contracts with traders shall not benefit from the liability exemption provided for in Article 5 if it does not comply with the obligations referred to in Articles 11, 13b, 13c, 14, 22 or 24a. Such liability exemption shall also not benefit the online platform if it does not comply with specific information requirements for contracts concluded on online marketplaces, in line with Article 6a(1) of the Directive 2011/83/EU of the European Parliament and of the Council. 2. The liability exemption in Article 5(1) and in paragraph 1 of this Article shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its control, authority or decisive influence. 3. For the assessment of whether the online platform has that control or authority or decisive influence over the trader, relevant criteria shall include, among others: (a) the trader-consumer contract is concluded exclusively through facilities provided on the platform; (b) the online platform operator withholds the identity of the trader or contact details until after the conclusion of the trader-consumer contract; (c) the online platform operator exclusively uses payment systems which enable the platform operator to withhold payments made by the consumer to the trader; (d) the terms of the trader-consumer contract are essentially determined by the online platform operator; (e) the price to be paid by the consumer is set by the online platform operator; or (f) the online platform is marketing the product or service in its own name rather than using the name of the trader who will supply it; 4. The liability exemption in Article 5(1) of this Regulation shall not apply in case an online platform allows consumers to conclude distance contracts with traders from third countries when: (a) there is no economic operator inside the Union liable for the product safety or when the economic operator is available but does not respond to claims or take measures to remedy the harm; and (b) the product does not comply with the relevant Union or national law; 5. Consumers concluding distance contracts with traders shall be entitled to seek redress from the online platform for infringement of the obligations laid down in this Regulation and in accordance with relevant Union and national law. 6. The online platform shall be entitled to seek redress from the trader who has used its services in case of a failure by that trader to comply with his obligations under this Regulation regarding the online platform or regarding the consumers.
2021/07/08
Committee: IMCO
Amendment 866 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights, including the respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/07/19
Committee: JURI
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, and shall use fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format in a searchable archive of all the previous versions with their date of application.
2021/07/08
Committee: IMCO
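The final sentence of the amendment asks for a searchable archive of all previous versions of the terms and conditions with their date of application; in data terms, an append-only version list queried by date. A small Python sketch under that reading; the class and method names are hypothetical.

# Sketch: archive of terms-and-conditions versions, searchable by date of
# application, as the amended Article 12(1) would require.
from bisect import bisect_right
from datetime import date

class TermsArchive:
    def __init__(self) -> None:
        self._versions: list[tuple[date, str]] = []  # (date of application, text)

    def publish(self, applies_from: date, text: str) -> None:
        # New versions are appended; previous versions remain publicly available.
        self._versions.append((applies_from, text))
        self._versions.sort(key=lambda v: v[0])

    def version_in_force(self, on: date) -> str:
        # Return the version that applied on a given date.
        dates = [d for d, _ in self._versions]
        idx = bisect_right(dates, on) - 1
        if idx < 0:
            raise LookupError("no version applied on that date")
        return self._versions[idx][1]

archive = TermsArchive()
archive.publish(date(2021, 1, 1), "v1: initial terms")
archive.publish(date(2021, 7, 1), "v2: adds content moderation policy")
print(archive.version_in_force(date(2021, 3, 15)))   # -> v1: initial terms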
Amendment 937 #
Proposal for a regulation
Article 12 – paragraph 1 a (new)
1a. Providers of intermediary services shall ensure their terms and conditions are age-appropriate and meet the highest European or International standards, pursuant to Article 34.
2021/07/08
Committee: IMCO
Amendment 951 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and the natural or legal person who finances the advertisement;
2021/07/19
Committee: JURI
Amendment 956 #
Proposal for a regulation
Article 30 – paragraph 2 – point e a (new)
(ea) any decisions by the online platform regarding labelling, removal or disabling of online advertisements, including a justification explaining the grounds for the decision.
2021/07/19
Committee: JURI
Amendment 973 #
Proposal for a regulation
Article 12 b (new)
Article 12b Mitigation of risks to children Providers of intermediary services likely to impact children shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 13 (12 a new). Such measures shall include, where applicable: (a) implementing mitigation measures identified in Article 27 with regard for children’s best interests; (b) adapting or removing system design features that expose children to content, contact, conduct and contract risks, as identified in the process of conducting child impact assessments; (c) implementing proportionate and privacy preserving age assurance, meeting the standard outlined in Article 34; (d) adapting content moderation or recommender systems, their decision- making processes, the features or functioning of their services, or their terms and conditions to ensure they prioritise the best interests of the child; (e) ensuring the highest levels of privacy, safety, and security by design and default for users under the age of 18; (f) preventing profiling, including for commercial purposes like targeted advertising; (g) ensuring published terms are age appropriate and uphold children’s rights; (h) providing child-friendly mechanisms for remedy and redress, including easy access to expert advice and support;
2021/07/08
Committee: IMCO
Amendment 975 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions or civil society organisations representing the public interest, be independent from commercial interests, disclose the sources of funding financing their research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/19
Committee: JURI
Amendment 977 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this Regulation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/19
Committee: JURI
Amendment 1006 #
Proposal for a regulation
Article 33 a (new)
Article 33a Interoperability 1. Very large online platforms shall offer interoperability of industry-standard features of their services to other online platforms by creating easily accessible application programming interfaces. 2. Very large online platforms may only temporarily limit access to interoperability features in case of provable abuse by a third-party provider or when justified by an immediate requirement to address a technical issue such as a serious security vulnerability. 3. In accordance with Union legislation on standardisation, the Commission shall request European standardisation bodies to develop the necessary technical standards for interoperability such as protocol interoperability and data interoperability and portability. 4. The Commission shall be empowered to review the implementation of these obligations by very large online platforms, adopt implementing measures specifying the nature and scope of the obligations, and provide updateable definitions of industry-standard features where necessary. 5. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
2021/07/19
Committee: JURI
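In engineering terms, paragraph 1 of this proposed Article 33a asks very large online platforms to expose industry-standard features through documented application programming interfaces. The Python sketch below illustrates one conceivable shape of such an endpoint for a messaging feature; the payload, names and abuse check are assumptions for illustration, not anything the amendment specifies.

# Hypothetical interoperability entry point for a core messaging feature.
import json
from dataclasses import dataclass, asdict

@dataclass
class InteropMessage:
    sender: str      # account on the originating third-party platform
    recipient: str   # account on the receiving very large online platform
    body: str        # plain payload; paragraph 3 envisages no format conversion

def is_abusive(third_party_id: str) -> bool:
    # Stand-in for an abuse policy; paragraph 2 allows only temporary limits
    # on provable abuse or to fix a serious security vulnerability.
    return False

def deliver(msg: InteropMessage) -> None:
    print(f"delivered to {msg.recipient}: {msg.body}")

def receive_from_third_party(raw: str, third_party_id: str) -> dict:
    # The call a competing service would make via the public API.
    if is_abusive(third_party_id):
        return {"status": 429, "error": "access temporarily limited"}
    deliver(InteropMessage(**json.loads(raw)))
    return {"status": 200}

payload = json.dumps(asdict(InteropMessage("alice@smallnet", "bob@vlop", "hi")))
print(receive_from_third_party(payload, third_party_id="smallnet"))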
Amendment 1015 #
Proposal for a regulation
Article 13 a (new)
Article 13a Display of the identity of business users 1. A provider of intermediary services shall ensure that the identity of the business user providing content, goods or services is clearly visible alongside the content, goods or services offered. 2. For this purpose, a provider of intermediary services shall establish a standardised and mandatory interface for business users. A content, good or service shall only be displayed to users if the necessary contact information is made available. 3. A provider of intermediary services shall on a regular basis conduct checks on the information provided by a business user in accordance with paragraph (2).
2021/07/08
Committee: IMCO
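Paragraph 2 of the proposed Article 13a works as a validation gate: a listing is shown only once the business user has supplied the required details. A short Python sketch of that gate; the field names are hypothetical.

# Sketch: display of content, goods or services is gated on complete trader
# contact information, per the proposed Article 13a(2).
REQUIRED_FIELDS = ("trader_name", "address", "email", "phone")

def may_display(listing: dict) -> bool:
    # True only if every required trader field is present and non-empty.
    trader = listing.get("trader", {})
    return all(trader.get(field) for field in REQUIRED_FIELDS)

listing = {"title": "Handmade lamp",
           "trader": {"trader_name": "Lux GmbH", "address": "Berlin",
                      "email": "info@lux.example", "phone": "+49 30 0000"}}
assert may_display(listing)
listing["trader"].pop("phone")
assert not may_display(listing)   # withheld until all details are available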
Amendment 1078 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Where an online platform that allows consumers to conclude distance contracts with traders detects and identifies illegal goods or services, it shall be obliged to establish an internal database of those goods and services that have previously been taken down by the online platform because they were found to be illegal or harmful. Drawing on the elements listed in the Rapid Exchange of Information System (RAPEX) and other relevant public databases, it shall scan its database on a daily basis to detect illegal goods and services. If this process detects a good or service that has previously been found to be illegal or harmful, the online platform shall be obliged to delete the content expeditiously.
2021/07/08
Committee: IMCO
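The obligation describes a recurring matching job: items removed as illegal are recorded internally, and the live catalogue is scanned daily against that record and against public alert feeds such as RAPEX. A simplified Python sketch follows; the matching key and data shapes are assumptions for illustration.

# Sketch of the daily re-check in the proposed Article 14(6a): previously
# removed goods are kept in an internal database and matched against the
# live catalogue together with RAPEX-style identifiers.
taken_down: set[str] = set()         # internal database of removed items
rapex_like_feed: set[str] = set()    # identifiers from public alert systems

def record_takedown(product_key: str) -> None:
    taken_down.add(product_key)

def daily_scan(catalogue: dict[str, dict]) -> list[str]:
    # Find catalogue entries matching known illegal goods and delete them.
    blocklist = taken_down | rapex_like_feed
    hits = [item_id for item_id, item in catalogue.items()
            if item["product_key"] in blocklist]
    for item_id in hits:
        del catalogue[item_id]       # expeditious removal on re-detection
    return hits

record_takedown("EAN:4006381333931")
catalogue = {"offer-1": {"product_key": "EAN:4006381333931"},
             "offer-2": {"product_key": "EAN:1111111111111"}}
print(daily_scan(catalogue))         # -> ['offer-1']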
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Article 15a Online interface design and organisation 1. Providers of hosting services shall not distort or impair consumers’ ability to make an informed decision via the structure, function or manner of operation of their online interface or a part thereof. 2. Providers of hosting services shall design and organise their online interface in a way that enables themselves and traders to comply with their obligations under applicable Union and Member State law on consumer protection, including on product safety.
2021/07/08
Committee: IMCO
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, the access to an effective internal complaint-handling system, which enables the complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon the receipt of a notice or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/07/08
Committee: IMCO
Amendment 1152 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
2021/07/08
Committee: IMCO
Amendment 1159 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
2021/07/08
Committee: IMCO
Amendment 1163 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
2021/07/08
Committee: IMCO
Amendment 1178 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling and redress systems are easy to access and user-friendly, including for children, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.
2021/07/08
Committee: IMCO
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
2021/07/08
Committee: IMCO
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/08
Committee: IMCO
Amendment 1243 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
2021/07/08
Committee: IMCO
Amendment 1276 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it is an individual rightholder or represents collective interests and is independent from any online platform;
2021/07/08
Committee: IMCO
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
2021/07/08
Committee: IMCO
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, functional and reliable tools through application programming interfaces, a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure multi-criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
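The amended Article 30(1) requires the ad repository to answer multi-criterion queries per advertiser and per data point, returning aggregates on spend, targeting and intended audience, while holding no personal data of recipients. A toy Python sketch of such a query; all field names are hypothetical.

# Toy multi-criterion query over an ad repository; records hold aggregate
# campaign data only, no personal data of the recipients of the service.
ads = [
    {"advertiser": "ACME", "spend_eur": 1200, "target": "DE", "audience": 50000},
    {"advertiser": "ACME", "spend_eur": 800,  "target": "IT", "audience": 30000},
    {"advertiser": "Beta", "spend_eur": 300,  "target": "DE", "audience": 10000},
]

def query(repo: list, **criteria) -> dict:
    # Filter on any combination of data points; return aggregated totals.
    rows = [ad for ad in repo
            if all(ad.get(k) == v for k, v in criteria.items())]
    return {"matches": len(rows),
            "total_spend_eur": sum(ad["spend_eur"] for ad in rows),
            "total_audience": sum(ad["audience"] for ad in rows)}

print(query(ads, advertiser="ACME"))               # per-advertiser aggregate
print(query(ads, advertiser="ACME", target="DE"))  # multi-criterion query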
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
2021/07/08
Committee: IMCO
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/08
Committee: IMCO
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33a Algorithm accountability 1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used. 2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements: (a) the compliance with corresponding Union requirements; (b) how the algorithm is used and its impact on the provision of the service; (c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and (d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on elements referred to in point (c). 3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations. 4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation on the conclusions of the findings or, when the additional information on the findings provided is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission. 5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure the algorithm complies with the criteria set out in paragraph 2. 6. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, on the basis of the information provided by the very large online platform, and that the very large online platform has not undertaken corrective measures as referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
2021/07/08
Committee: IMCO
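Paragraph 2 enumerates four assessment elements, (a) to (d), and paragraph 5 ties corrective measures to failures under (a) or (d). As a reading aid only, here is a hypothetical Python record a platform might use to capture such findings; nothing here is prescribed by the amendment.

# Hypothetical record for the algorithm assessment of the proposed
# Article 33a(2), covering elements (a)-(d) and the corrective-measure step.
from dataclasses import dataclass

@dataclass
class AlgorithmAssessment:
    algorithm_id: str
    complies_with_union_law: bool        # element (a)
    usage_and_service_impact: str        # element (b)
    fundamental_rights_impact: str       # element (c)
    resilience_measures_adequate: bool   # element (d)

    def needs_corrective_measures(self) -> bool:
        # Paragraph 5: corrective measures where (a) or (d) is not satisfied.
        return not (self.complies_with_union_law
                    and self.resilience_measures_adequate)

assessment = AlgorithmAssessment(
    algorithm_id="feed-ranker-v7",
    complies_with_union_law=True,
    usage_and_service_impact="ranks the home feed for all recipients",
    fundamental_rights_impact="assessed for discrimination and media pluralism",
    resilience_measures_adequate=False)
print(assessment.needs_corrective_measures())   # -> True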
Amendment 1809 #
Proposal for a regulation
Article 33 a (new)
Article 33a Interoperability 1. Very large online platforms shall provide, by creating and offering an application programming interface, options enabling the interoperability of their core services with other online platforms. 2. Application programming interfaces should be easy to use, while the processing of personal data shall only be possible in a manner that ensures appropriate security of those data. Measures under paragraph (1) may not limit, hinder or delay the ability of content hosting platforms to fix security issues, nor should the need to fix security issues lead to an undue delay in the provision of interoperability. 3. This Article is without prejudice to any limitations and restrictions set out in Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1914 #
Proposal for a regulation
Article 38 – paragraph 3 – subparagraph 2
Member States shall make publicly available through online and offline means, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted.
2021/07/08
Committee: IMCO