
104 Amendments of Marco ZANNI related to 2020/0361(COD)

Amendment 136 #
Proposal for a regulation
Recital 3
(3) Responsible and diligent behaviour by providers of intermediary services is essential for a safe, accessible, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination.
2021/09/10
Committee: ECON
Amendment 142 #
Proposal for a regulation
Recital 5
(5) This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council26, that is, any service frequently provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. _________________ 26Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
2021/09/10
Committee: ECON
Amendment 147 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2021/09/10
Committee: ECON
Amendment 162 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, cyberbullying, man-in-the-middle (MITM) attacks, phishing, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from the criminal, administrative or civil legal framework of a Member State and what the precise nature or subject matter is of the law in question.
2021/09/10
Committee: ECON
Amendment 171 #
Proposal for a regulation
Recital 15 a (new)
(15a) The general collection of personal data concerning the use of digital services interferes disproportionately with the right to privacy in the digital age. In line with the principle of data minimisation and in order to prevent unauthorised disclosure, identity theft and other forms of abuse of personal data, recipients should have the possibility to access information society services and the right to use and pay for information society services anonymously wherever technically possible. Similarly, users have a right not to be subject to tracking when using information society services. To that end, the processing of personal data concerning the use of digital services should be limited to the extent strictly necessary to provide the service and to bill the users. In addition, the collection of personal data for the purposes of retention, sale or user profiling should be prohibited.
2021/09/10
Committee: ECON
Amendment 177 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act to remove or to disable access to that content when such content is deemed to be illegal according to Union or Member State law. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness of illegal content through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/09/10
Committee: ECON
Amendment 178 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/09/10
Committee: ECON
Amendment 180 #
Proposal for a regulation
Recital 23 a (new)
(23a) European consumers should be able to safely purchase products and services online, regardless of whether a product or service has been produced in the Union or not. Online platforms allowing distance contracts with third-country traders should establish, before approving a trader on their platform, that the third-country trader complies with the relevant Union or national law on product safety and product compliance. In addition, if the third-country trader does not provide an economic operator inside the Union liable for the product safety, online platforms should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation.
2021/09/10
Committee: ECON
Amendment 189 #
Proposal for a regulation
Recital 28
(28) Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature, nor should they use automated moderation. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
2021/09/10
Committee: ECON
Amendment 203 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities, which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/09/10
Committee: ECON
Amendment 207 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. To this end, the use of algorithmic decision-making processes should be disclosed to users whenever they are employed.
2021/09/10
Committee: ECON
Amendment 214 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress.
2021/09/10
Committee: ECON
Amendment 218 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 _________________ 43Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
deleted
2021/09/10
Committee: ECON
Amendment 222 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/09/10
Committee: ECON
Amendment 228 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should promptly inform the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. _________________ 44Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/09/10
Committee: ECON
Amendment 231 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. Online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Such online platforms may also ask for support from the Digital Services Coordinator in carrying out these specific obligations. If the trader is established outside the Union and does not cooperate or does not provide sufficient information for the verification of its compliance with the relevant Union or Member State law, this trader should not be admitted to operate and sell its products on the platform. If the trader is already on the platform and does not meet the above criteria, the platform should suspend that trader's account. The trader should be granted the possibility of redress in the event of suspension of the business account.
Online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48. _________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/09/10
Committee: ECON
Amendment 254 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/09/10
Committee: ECON
Amendment 269 #
Proposal for a regulation
Recital 67
(67) The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up and adhere to specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.
2021/09/10
Committee: ECON
Amendment 273 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content, e.g. the sharing of images depicting child sexual abuse or terrorist content, should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/09/10
Committee: ECON
Amendment 275 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
2021/09/10
Committee: ECON
Amendment 290 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring a careful assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.
2021/09/10
Committee: ECON
Amendment 298 #
Proposal for a regulation
Recital 98
(98) In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties.
2021/09/10
Committee: ECON
Amendment 299 #
Proposal for a regulation
Recital 99
(99) The Commission should be able to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, other than individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers.
2021/09/10
Committee: ECON
Amendment 308 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/09/10
Committee: ECON
Amendment 312 #
Proposal for a regulation
Article 1 – paragraph 5 – point b a (new)
(ba) Directive (EU) 2019/882;
2021/09/10
Committee: ECON
Amendment 318 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – introductory part
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as:
2021/09/10
Committee: ECON
Amendment 320 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States; or
deleted
2021/09/10
Committee: ECON
Amendment 326 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information or activity, including the sale of products or provision of services, which is not in compliance with Union law or the criminal, administrative or civil legal framework of a Member State;
2021/09/10
Committee: ECON
Amendment 338 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) 'persons with disabilities' means persons within the meaning of Article 3(1) of Directive (EU) 2019/882;
2021/09/10
Committee: ECON
Amendment 341 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts to remove or to disable access to the illegal content if the content or activity is deemed illegal under Article 2(g).
2021/09/10
Committee: ECON
Amendment 345 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. In addition, the liability exemption in paragraph 1 shall not apply in case an online platform allows consumers to conclude distance contracts with third-country traders when there is no economic operator inside the Union liable for the product safety on behalf of that trader.
2021/09/10
Committee: ECON
Amendment 360 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means and by telephone, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation.
2021/09/10
Committee: ECON
Amendment 362 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public, in a clear and user-friendly manner, the information necessary to easily identify and communicate with their single points of contact.
2021/09/10
Committee: ECON
Amendment 369 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. The use of algorithmic decision-making processes shall be notified to users whenever they are applied. The users should be able, where appropriate, to switch easily from interaction with the algorithmic system to human interaction. The information shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. Providers of intermediary services should list the restrictions in relation to the use of their service for the dissemination of content deemed illegal under Union or Member State law in a clear and user-friendly manner, and differentiate that list from the general conditions for the use of their service, so as to make the user aware of what is deemed illegal under the law and what is subject to the terms and conditions for the use of the service.
2021/09/10
Committee: ECON
Amendment 374 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a transparent, diligent, objective, non-arbitrary and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter.
2021/09/10
Committee: ECON
Amendment 379 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, measures and tools used for the purpose of content moderation, including the impact of algorithmic decision-making compared to human review, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/09/10
Committee: ECON
Amendment 383 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. Any moderation system adopted either by providers of hosting services or by platforms shall not be applied to elected individuals, who shall be granted immunity because of their institutional role.
2021/09/10
Committee: ECON
Amendment 387 #
Proposal for a regulation
Article 14 – paragraph 2 – introductory part
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which an economic operator can establish, in a diligent manner and without discrimination, whether the notice concerns illegal content as defined in Article 2(g) of this Regulation. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
2021/09/10
Committee: ECON
Amendment 390 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content. The possibility of identifying, on the basis of a list drawn up in agreement with the Digital Services Coordinator, the type of illegal content to which the individual or entity presumes the reported content belongs should also be foreseen;
2021/09/10
Committee: ECON
Amendment 397 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.deleted
2021/09/10
Committee: ECON
Amendment 401 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent, fair and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/09/10
Committee: ECON
Amendment 404 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access and the duration;
2021/09/10
Committee: ECON
Amendment 405 #
Proposal for a regulation
Article 15 – paragraph 2 – point c
(c) where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means;
2021/09/10
Committee: ECON
Amendment 417 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective and user-friendly internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/09/10
Committee: ECON
Amendment 419 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove, restrict, modify or disable access to the information;
2021/09/10
Committee: ECON
Amendment 422 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint handling system in a clear and user-friendly manner. The complainant should be able to enter free written explanations in addition to the pre-established complaint options.
2021/09/10
Committee: ECON
Amendment 425 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent, objective and transparent manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/09/10
Committee: ECON
Amendment 428 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall promptly inform complainants of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities.
2021/09/10
Committee: ECON
Amendment 431 #
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/09/10
Committee: ECON
Amendment 432 #
Proposal for a regulation
Article 18 – paragraph 2 – point d
(d) it is capable of settling disputes in a swift, efficient, accessible for persons with disabilities, and cost-effective manner and in at least one official language of the Union and at least in the language of the recipient to whom the decision referred to in Article 17 is addressed;
2021/09/10
Committee: ECON
Amendment 433 #
Proposal for a regulation
Article 18 – paragraph 2 – point e
(e) the dispute settlement takes place in accordance with clear, fair and transparent rules of procedure.
2021/09/10
Committee: ECON
Amendment 434 #
Proposal for a regulation
Article 18 – paragraph 3 – introductory part
3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement.
2021/09/10
Committee: ECON
Amendment 437 #
Proposal for a regulation
Article 19
[...]deleted
2021/09/10
Committee: ECON
Amendment 448 #
Proposal for a regulation
Article 20 – paragraph 1
1. Online platforms shall suspend, for a specified period of time and after having issued a prior warning and provided a comprehensive explanation, the provision of their services to recipients of the service that frequently provide manifestly illegal content. The online platform may request support from the Digital Services Coordinator to establish the frequency for which account suspension is deemed necessary and to set the duration of the suspension.
2021/09/10
Committee: ECON
Amendment 452 #
Proposal for a regulation
Article 20 – paragraph 2
2. Online platforms shall suspend, for a specified period of time and after having issued at least three prior warnings, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/09/10
Committee: ECON
Amendment 453 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year;
2021/09/10
Committee: ECON
Amendment 455 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. The assessment must be carried out by qualified staff provided with dedicated training on the applicable legal framework.
2021/09/10
Committee: ECON
Amendment 460 #
Proposal for a regulation
Article 21 – paragraph 2 – introductory part
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform without undue delay the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol.
2021/09/10
Committee: ECON
Amendment 463 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
2021/09/10
Committee: ECON
Amendment 465 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the bank account details of the trader, where the trader is a natural person;
2021/09/10
Committee: ECON
Amendment 467 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader that the products or services provided comply with the relevant Union or national law on product safety and product compliance.
2021/09/10
Committee: ECON
Amendment 469 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, assess, with the support of the Digital Services Coordinator if needed, whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable and official sources. Online platforms allowing distance contracts with third-country traders should establish that the third-country trader complies with the relevant Union or national law on product safety and product compliance before giving them access to its services offered in the Union and, where appropriate, with the support of the Digital Services Coordinator. The Digital Services Coordinator may request support from market surveillance or customs authorities to assess the information provided by the trader.
2021/09/10
Committee: ECON
Amendment 472 #
Proposal for a regulation
Article 22 – paragraph 4
4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned, asking the trader to notify any changes and confirm the information held by the online platform once a year. After the contractual relationship has ended, the online platform shall delete the information.
2021/09/10
Committee: ECON
Amendment 474 #
Proposal for a regulation
Article 23 – paragraph 1 – point b
(b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints;
2021/09/10
Committee: ECON
Amendment 475 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2021/09/10
Committee: ECON
Amendment 487 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
Special attention shall be given to recipients of the service who are minors. When advertising is addressed to minors, online platforms shall indicate in a clear, easy and unambiguous manner that such advertising targets this group of recipients.
2021/09/10
Committee: ECON
Amendment 499 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination, including through algorithmic biases, and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively;
2021/09/10
Committee: ECON
Amendment 504 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/09/10
Committee: ECON
Amendment 506 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/09/10
Committee: ECON
Amendment 507 #
Proposal for a regulation
Article 27 – paragraph 1 – point a
(a) checking content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions;
2021/09/10
Committee: ECON
Amendment 510 #
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;deleted
2021/09/10
Committee: ECON
Amendment 511 #
Proposal for a regulation
Article 27 – paragraph 2 – introductory part
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following:
2021/09/10
Committee: ECON
Amendment 513 #
Proposal for a regulation
Article 27 – paragraph 2 – point b
(b) best practices for very large online platforms to mitigate the systemic risks identified.deleted
2021/09/10
Committee: ECON
Amendment 516 #
Proposal for a regulation
Article 27 – paragraph 3
3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.deleted
2021/09/10
Committee: ECON
Amendment 519 #
Proposal for a regulation
Article 28 – paragraph 1 – introductory part
1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
2021/09/10
Committee: ECON
Amendment 522 #
Proposal for a regulation
Article 28 – paragraph 2 – point b
(b) have proven expertise in the area of risk management, technical competence and capabilities certified by a qualified and accredited certification body;
2021/09/10
Committee: ECON
Amendment 527 #
Proposal for a regulation
Article 28 – paragraph 4
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
2021/09/10
Committee: ECON
Amendment 539 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose;deleted
2021/09/10
Committee: ECON
Amendment 549 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
(b) electronic submission of notices by trusted flaggers under Article 19, including through application programming interfaces;deleted
2021/09/10
Committee: ECON
Amendment 552 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission, in agreement with the Board, may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/09/10
Committee: ECON
Amendment 559 #
Proposal for a regulation
Article 39 – paragraph 1
1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. Similarly, Member States shall ensure that Digital Services Coordinators have all the competences and powers provided for by the current national laws.
2021/09/10
Committee: ECON
Amendment 562 #
Proposal for a regulation
Article 40 – paragraph 3
3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States to ensure that the principle of ne bis in idem is respected.
2021/09/10
Committee: ECON
Amendment 563 #
Proposal for a regulation
Article 41 – paragraph 1 – point b
(b) the power to carry out on-site inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium;deleted
2021/09/10
Committee: ECON
Amendment 564 #
Proposal for a regulation
Article 41 – paragraph 1 – point c
(c) the power to ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers.deleted
2021/09/10
Committee: ECON
Amendment 565 #
Proposal for a regulation
Article 41 – paragraph 2 – point b
(b) the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end;deleted
2021/09/10
Committee: ECON
Amendment 566 #
Proposal for a regulation
Article 41 – paragraph 2 – point e
(e) the power to adopt interim measures to avoid the risk of serious harm.deleted
2021/09/10
Committee: ECON
Amendment 567 #
Proposal for a regulation
Article 41 – paragraph 3 – point a
(a) require the management body of the providers, within a reasonable time period, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken within a specific period;
2021/09/10
Committee: ECON
Amendment 568 #
Proposal for a regulation
Article 41 – paragraph 3 – point b
(b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.
2021/09/10
Committee: ECON
Amendment 577 #
Proposal for a regulation
Article 48 – paragraph 1
1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. A meeting shall be deemed valid when at least two thirds of the eligible members are present.
2021/09/10
Committee: ECON
Amendment 578 #
Proposal for a regulation
Article 48 – paragraph 2 – subparagraph 1
The Board shall adopt its acts by simple majority. In the event of a tied vote, the vote shall be considered void and a new vote shall be held by the Board.
2021/09/10
Committee: ECON
Amendment 585 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
2021/09/10
Committee: ECON
Amendment 587 #
Proposal for a regulation
Article 50 – paragraph 2
2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may recommend, where appropriate, participation in a code of conduct as provided for in Article 35.
2021/09/10
Committee: ECON
Amendment 591 #
Proposal for a regulation
Article 51 – paragraph 1 – introductory part
1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that:
2021/09/10
Committee: ECON
Amendment 592 #
Proposal for a regulation
Article 51 – paragraph 2 – introductory part
2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. If the Commission decides not to initiate proceedings pursuant to paragraph 1, it shall inform the Board in writing of its reasons.
2021/09/10
Committee: ECON
Amendment 597 #
Proposal for a regulation
Article 54
Power to conduct on-site inspections 1. In order to carry out the tasks assigned to it under this Section, the Commission may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1). 2. On-site inspections may also be carried out with the assistance of auditors or experts appointed by the Commission pursuant to Article 57(2). 3. During on-site inspections the Commission and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1). 4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union. Article 54 deleted
2021/09/10
Committee: ECON
Amendment 602 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, after consulting the Board, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
2021/09/10
Committee: ECON
Amendment 605 #
Proposal for a regulation
Article 56 – paragraph 1
1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision, after consulting the Board, make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action.
2021/09/10
Committee: ECON
Amendment 606 #
Proposal for a regulation
Article 56 – paragraph 3
3. Where the Commission considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall reject those commitments in a reasoned decision, in agreement with the Board, when concluding the proceedings.
2021/09/10
Committee: ECON
Amendment 610 #
Proposal for a regulation
Article 58 – paragraph 1 – introductory part
1. The Commission shall adopt a non-compliance decision, after consulting the Board, where it finds that the very large online platform concerned does not comply with one or more of the following:
2021/09/10
Committee: ECON
Amendment 616 #
Proposal for a regulation
Article 58 – paragraph 5
5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision approved by the Board.
2021/09/10
Committee: ECON
Amendment 622 #
Proposal for a regulation
Article 59 – paragraph 2 – point c
(c) refuse to submit to an on-site inspection pursuant to Article 54.deleted
2021/09/10
Committee: ECON
Amendment 623 #
Proposal for a regulation
Article 59 – paragraph 3
3. Before adopting the decision pursuant to paragraph 2, the Commission shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1) and to the Board.
2021/09/10
Committee: ECON
Amendment 626 #
Proposal for a regulation
Article 60 – paragraph 1 – point b
(b) submit to an on-site inspection which it has ordered by decision pursuant to Article 54;deleted
2021/09/10
Committee: ECON
Amendment 631 #
Proposal for a regulation
Article 61 – paragraph 3 – point b
(b) on-site inspection;deleted
2021/09/10
Committee: ECON