
Activities of Ivan ŠTEFANEC related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (157)

Amendment 186 #
Proposal for a regulation
Recital 2 a (new)
(2a) Moreover, complex national regulatory requirements, fragmented implementation and insufficient enforcement of legislation such as Directive 2000/31/EC have contributed to high administrative costs and legal uncertainty for intermediary services operating on the internal market, especially micro, small and medium-sized companies.
2021/07/08
Committee: IMCO
Amendment 208 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the directing of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top-level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. __________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2021/07/08
Committee: IMCO
Amendment 213 #
Proposal for a regulation
Recital 9
(9) This Regulation fully harmonises the rules applicable to intermediary services in the internal market with the objective to ensure a safe and trusted online environment, effective protection of fundamental rights and a favourable business climate. Accordingly, Member States should not adopt or maintain additional national requirements on those matters falling within the scope of this Regulation. This does not preclude the possibility to apply other national legislation applicable to providers of intermediary services, in accordance with Union law, including Directive 2000/31/EC, in particular its Article 3, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended,28 and Regulation (EU) …/.. of the European Parliament and of the Council29 – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. __________________ 28 Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (Text with EEA relevance), OJ L 95, 15.4.2010, p. 1. 29 Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation
2021/07/08
Committee: IMCO
Amendment 224 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities, including fake online profile accounts. Illegal content is often spread online precisely via fake online profile accounts. Namely, false representation in the ‘online world’ should not be legal as it is also not legal to falsely present oneself in the ‘offline world’. This approach is an evident manifestation of the principle that what is illegal offline should not be allowed to remain legal online. Moreover, the concept of “illegal content” should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 226 #
Proposal for a regulation
Recital 12
(12) For the purpose of this Regulation the concept of “illegal content” should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or it is not in compliance with Union law as it refers to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/07/08
Committee: IMCO
Amendment 249 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Services, such as internet infrastructure services or cloud service providers, which are provided at the request of parties other than the content providers and only indirectly benefitting the latter, should not be covered by the definition of online platforms. __________________ 39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
2021/07/08
Committee: IMCO
Amendment 257 #
Proposal for a regulation
Recital 17
(17) The relevant rules of Chapter II should only establish when the provider of intermediary services concerned cannot be held liable in relation to illegal content provided by the recipients of the service. Those rules should by no means be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine. Furthermore, the exemptions from liability established in this Regulation should apply in respect of any type of liability as regards any type of illegal content, irrespective of the precise subject matter or nature of those laws.
2021/07/08
Committee: IMCO
Amendment 278 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act without undue delay to remove or to disable access to that content. The removal or disabling of access should be undertaken in observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and, where appropriate, act against the allegedly illegal content.
2021/07/08
Committee: IMCO
Amendment 281 #
Proposal for a regulation
Recital 22 a (new)
(22a) The exemption of liability should not apply where the recipient of the service is acting under the authority or the control of the provider of a hosting service. In particular, where the provider of the online platform that allows consumers to conclude distance contracts with traders does not allow traders to determine the basic elements of the trader-consumer contract, such as the terms and conditions governing such relationship or the price, it should be considered that the trader acts under the authority or control of that platform.
2021/07/08
Committee: IMCO
Amendment 282 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders as a functionality of their service, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. This is the case where the online platform operator fails to clearly display the identity of the trader in accordance with this Regulation. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer. In particular, it is relevant whether the online platform operator withholds such identity or contract details until after the conclusion of the trader-consumer contract, or is marketing the product or service in its own name rather than using the name of the trader who will supply it.
2021/07/08
Committee: IMCO
Amendment 291 #
Proposal for a regulation
Recital 23 a (new)
(23a) Consumers should be able to safely purchase products and services online, irrespective of whether a product or service has been produced in the Union. For that reason, traders from third countries should establish a legal representative in the Union to whom claims regarding product safety could be addressed. Providers of intermediary services from inside the Union as well as from third countries should ensure compliance with product requirements set out in Union law.
2021/07/08
Committee: IMCO
Amendment 310 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, cloud infrastructure services or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
2021/07/08
Committee: IMCO
Amendment 331 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. Since intermediaries should not be required to remove information which is legal in their country of establishment, national and Union authorities should be able to order the blocking of content legally published outside the Union only for the territory of the Union where Union law is infringed and for the territory of the issuing Member State where national law is infringed.
2021/07/08
Committee: IMCO
Amendment 346 #
Proposal for a regulation
Recital 34
(34) In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should target illegal content and aim in particular to guarantee different public policy objectives such as consumer protection and the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
2021/07/08
Committee: IMCO
Amendment 353 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type, nature and size of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
2021/07/08
Committee: IMCO
Amendment 358 #
Proposal for a regulation
Recital 36 a (new)
(36a) Providers of intermediary services should also establish a single point of contact for recipients of services, allowing rapid, direct and efficient communication.
2021/07/08
Committee: IMCO
Amendment 362 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. Obligations related to terms and conditions should not oblige a provider of an intermediary service to disclose information that will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets or intellectual property rights.
2021/07/08
Committee: IMCO
Amendment 372 #
Proposal for a regulation
Recital 39
(39) To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro-, small or medium-sized enterprises as defined in Commission Recommendation 2003/361/EC.40 __________________ 40 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/08
Committee: IMCO
Amendment 376 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible, comprehensive and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content following the applicable law ('action'). Such mechanisms should be clearly visible on the interface of the hosting service and easy to use. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as providers of hosting services covered by this Regulation. Providers of hosting services could, as a voluntary measure, conduct own-investigation measures to prevent content which has previously been identified as illegal from being disseminated again once removed. The obligations related to notice and action should by no means impose general monitoring obligations.
2021/07/08
Committee: IMCO
Amendment 385 #
Proposal for a regulation
Recital 41
(41) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content. Providers of hosting services should act upon notices without undue delay, taking into account the type of illegal content that is being notified and the urgency of taking action. The provider of hosting services should inform the individual or entity notifying the specific content of its decision without undue delay after taking a decision whether to act upon the notice or not.
2021/07/08
Committee: IMCO
Amendment 396 #
Proposal for a regulation
Recital 42 a (new)
(42a) A hosting service provider may in some instances become aware, for instance through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the hosting service provider is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the hosting service provider should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by hosting service providers. Hosting service providers should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
2021/07/08
Committee: IMCO
Amendment 401 #
Proposal for a regulation
Recital 43
(43) To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro-, small or medium-sized enterprises as defined in Recommendation 2003/361/EC of the Commission,41 unless their reach and impact are such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. __________________ 41 Commission Recommendation 2003/361/EC of 6 May 2003 concerning the definition of micro, small and medium-sized enterprises (OJ L 124, 20.5.2003, p. 36).
2021/07/08
Committee: IMCO
Amendment 403 #
Proposal for a regulation
Recital 43 a (new)
(43a) To similarly avoid unnecessary regulatory burdens, certain obligations should not apply to hosting service providers often referred to as closed online platforms where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 405 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and simple, affordable, expedient and accessible manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/08
Committee: IMCO
Amendment 416 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated without delay and in accordance with the rules of the profession, but without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
2021/07/08
Committee: IMCO
Amendment 417 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, depending on the severity of the illegal activity, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and private or semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic content online. For intellectual property rights, organisations of industry and of individual right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
2021/07/08
Committee: IMCO
Amendment 418 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. Providers of hosting services could, as a voluntary measure, introduce own-investigation measures to prevent accounts which have previously been identified as illegal from reappearing once removed. The obligations related to notice and action should by no means impose general monitoring obligations. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be open against the decisions taken in this regard by online platforms and they should be subject to oversight by the competent Digital Services Coordinator.
The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/08
Committee: IMCO
Amendment 425 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44 . In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
deleted
__________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2021/07/08
Committee: IMCO
Amendment 445 #
Proposal for a regulation
Recital 50
(50) To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System45, and the Union Rapid Alert System for dangerous non-food products (Rapex), or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council46, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council47 and Article 3 of Directive 98/6/EC of the European Parliament and of the Council48.
__________________ 45 https://ec.europa.eu/taxation_customs/vies/vieshome.do?selectedLanguage=en 46Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council 47Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (‘Unfair Commercial Practices Directive’) 48Directive 98/6/EC of the European Parliament and of the Council of 16 February 1998 on consumer protection in the indication of the prices of products offered to consumers
2021/07/08
Committee: IMCO
Amendment 455 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. Online advertising is a significant source of financing for many digital business models and an effective tool to reach new customers, not least for small- and medium-sized companies. However, there are some instances when online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. To ensure consumer protection, online advertisement should be subject to proportionate and meaningful transparency obligations. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising.
Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/07/08
Committee: IMCO
Amendment 469 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses could have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. Accordingly, the number of average monthly recipients of the service should reflect the recipients actually reached by the service either by being exposed to content or by providing content disseminated on the platforms’ interface in that period of time. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. The threshold should be designed to target the largest platforms with a reach in the Union that could lead to a systemic impact. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means; placing such due diligence obligations on smaller companies, especially micro, small and medium-sized companies, would be disproportionate.
2021/07/08
Committee: IMCO
Amendment 474 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. They can sometimes amplify the dissemination of illegal content. Effective regulation and enforcement is needed to effectively identify and mitigate the risks and the societal and economic harm that may arise. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as by potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/07/08
Committee: IMCO
Amendment 490 #
Proposal for a regulation
Recital 61
(61) The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the very large online platform to comply with their obligations under this Regulation, without prejudice to its freedom to conduct a business and, in particular, its ability to design and implement effective measures that are aligned with its specific business model. The report should be transmitted to the Digital Services Coordinator of establishment and the Board within 30 days following its adoption, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken.
2021/07/08
Committee: IMCO
Amendment 496 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. Often, they facilitate the search for relevant content for recipients of the service and contribute to an improved user experience. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them through making active choices. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them and why. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
2021/07/08
Committee: IMCO
Amendment 500 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/07/08
Committee: IMCO
Amendment 506 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers, where relevant to a research project. All requests for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
2021/07/08
Committee: IMCO
Amendment 520 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of intentionally inaccurate or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for certain groups of recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/07/08
Committee: IMCO
Amendment 605 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter, with special focus on the most vulnerable such as children and persons with disabilities, are effectively protected.
2021/07/08
Committee: IMCO
Amendment 613 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.
2021/07/08
Committee: IMCO
Amendment 614 #
Proposal for a regulation
Article 1 – paragraph 2 – point b – point i (new)
i) facilitate innovation, support the digital transition, encourage economic growth and create a level playing field for digital services within the internal market, while strengthening consumer protection and contributing to increased consumer choice.
2021/07/08
Committee: IMCO
Amendment 659 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in one or more Member States; or
deleted
2021/07/08
Committee: IMCO
Amendment 665 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 2
— the directing of activities towards one or more Member States.
2021/07/08
Committee: IMCO
Amendment 669 #
Proposal for a regulation
Article 2 – paragraph 1 – point e
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;
2021/07/08
Committee: IMCO
Amendment 681 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services and in particular fake online profile accounts, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/08
Committee: IMCO
Amendment 686 #
Proposal for a regulation
Article 2 – paragraph 1 – point g
(g) ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
2021/07/08
Committee: IMCO
Amendment 697 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation. Infrastructure services such as webhosting or cloud service providers shall not be covered by the definition of online platforms;
2021/07/08
Committee: IMCO
Amendment 700 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(ha) ‘editorial platform’ means an intermediary service which is in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790 or another editorial media service and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content and which is under the supervision of the editorial team of the publication or other editorial media.
2021/07/08
Committee: IMCO
Amendment 720 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that information;
2021/07/08
Committee: IMCO
Amendment 728 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
2021/07/08
Committee: IMCO
Amendment 761 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts without undue delay to remove or to disable access to the illegal content.
2021/07/08
Committee: IMCO
Amendment 787 #
Proposal for a regulation
Article 6 – paragraph 1
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they take the necessary voluntary measures aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation, without prejudice to freedom of expression.
2021/07/08
Committee: IMCO
Amendment 790 #
Proposal for a regulation
Article 6 – paragraph 1 a (new)
Providers of intermediary services shall ensure that such measures are accompanied by appropriate safeguards, such as oversight, documentation and traceability or additional measures to ensure that own-initiative investigations are accurate, legally justified and do not lead to over-removal of content.
2021/07/08
Committee: IMCO
Amendment 795 #
Proposal for a regulation
Article 7 – paragraph 1
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers, unless the information society service plays an active role in approving, modifying or editing the information issued by the recipient of the service.
2021/07/08
Committee: IMCO
Amendment 830 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(ba) the territorial scope of an order addressed to a provider that has its main establishment or, if the provider is not established in the Union, its legal representation in another Member State is limited to the territory of the Member State issuing the order;
2021/07/08
Committee: IMCO
Amendment 833 #
Proposal for a regulation
Article 8 – paragraph 2 – point b b (new)
(bb) if addressed to a provider that has its main establishment outside the Union, the territorial scope of the order, where Union law is infringed, is limited to the territory of the Union or, where national law is infringed, to the territory of the Member State issuing the order;
2021/07/08
Committee: IMCO
Amendment 838 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in English or the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10.
2021/07/08
Committee: IMCO
Amendment 857 #
Proposal for a regulation
Article 8 a (new)
Article 8a Injunction orders Member States shall ensure that recipients of a service are entitled under their national law to seek an injunction order as an interim measure for removing manifestly illegal content.
2021/07/08
Committee: IMCO
Amendment 897 #
Proposal for a regulation
Article 10 – title
Points of contact for authorities, the Commission and the Board
2021/07/08
Committee: IMCO
Amendment 903 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall communicate to their Digital Services Coordinator of establishment, the Commission and the Board the information necessary to easily identify and communicate with their single points of contact.
2021/07/08
Committee: IMCO
Amendment 908 #
Proposal for a regulation
Article 10 a (new)
Article 10a Point of contact for recipients of a service 1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with the recipients of their services. The means of communication shall be user-friendly and easily accessible. 2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact for recipients.
2021/07/08
Committee: IMCO
Amendment 918 #
Proposal for a regulation
Article 11 – paragraph 4 a (new)
4a. Providers of intermediary services that would qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC if established in the Union, and who have been unsuccessful in designating a legal representative after reasonable efforts, shall be able to request that the Digital Service Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including the possibility for collective representation.
2021/07/08
Committee: IMCO
Amendment 925 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including information about algorithmic decision-making and human review. Providers of intermediary services shall also include information on the right to terminate the use of the service. The possibility to terminate must be easily accessible for the user. Information on remedies and redress mechanisms shall also be included in the terms and conditions. The terms and conditions shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/08
Committee: IMCO
Amendment 950 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Obligations pursuant to paragraphs 1 and 2 should not oblige a provider of an intermediary service to disclose information that would lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets or intellectual property rights.
2021/07/08
Committee: IMCO
Amendment 971 #
Proposal for a regulation
Article 12 a (new)
Article 12a Exclusions Articles 12 and 13 of Section 1, and the provisions of Section 2, and Section 3 of Chapter III shall not apply to: (a) editorial platforms within the meaning of Article 2(h) of this Regulation; (b) online platforms that qualify as micro and medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC.
2021/07/08
Committee: IMCO
Amendment 975 #
Proposal for a regulation
Article 13 – paragraph 1 – introductory part
1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period, including information on the following, as applicable:
2021/07/08
Committee: IMCO
Amendment 989 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
2021/07/08
Committee: IMCO
Amendment 994 #
Proposal for a regulation
Article 13 – paragraph 1 – point d a (new)
(da) All providers of intermediary services that are likely to be accessed by children must have provisions and resources in place to safeguard children’s rights and wellbeing as described in the UN Convention on the Rights of the Child and the Convention’s General Comment 25. The impact of services on children must be assessed regularly and children’s rights and wellbeing embedded in the design of services’ updates and innovation.
2021/07/08
Committee: IMCO
Amendment 1002 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a medium-sized, small or micro-enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status pursuant to Article 4(2) thereof.
2021/07/08
Committee: IMCO
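The SME exemptions recurring throughout these amendments all hinge on the size ceilings in the Annex to Recommendation 2003/361/EC. As an illustration only, a minimal sketch of those headcount and financial ceilings; it deliberately ignores the Recommendation's linked/partner-enterprise aggregation rules and the Article 4(2) grace period, and all function names are the author's own:

```python
def enterprise_category(staff: int, turnover_m_eur: float, balance_m_eur: float) -> str:
    """Classify an enterprise by the ceilings in the Annex to
    Recommendation 2003/361/EC (simplified sketch: no aggregation
    of linked or partner enterprises, no Article 4(2) grace period).
    Financial figures are in millions of euros."""
    # Micro: fewer than 10 staff AND turnover or balance sheet <= EUR 2m
    if staff < 10 and (turnover_m_eur <= 2 or balance_m_eur <= 2):
        return "micro"
    # Small: fewer than 50 staff AND turnover or balance sheet <= EUR 10m
    if staff < 50 and (turnover_m_eur <= 10 or balance_m_eur <= 10):
        return "small"
    # Medium: fewer than 250 staff AND turnover <= EUR 50m or balance <= EUR 43m
    if staff < 250 and (turnover_m_eur <= 50 or balance_m_eur <= 43):
        return "medium"
    return "large"
```

Under the amendment above, paragraph 1 would then apply only to providers classified as "large" (or those past the twelve-month grace period after outgrowing SME status).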
Amendment 1009 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2a. Paragraph 1 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 1060 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 on the basis of which a diligent provider of hosting services is able to assess the illegality of the content in question, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
2021/07/08
Committee: IMCO
Amendment 1064 #
Proposal for a regulation
Article 14 – paragraph 4
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
2021/07/08
Committee: IMCO
Amendment 1081 #
Proposal for a regulation
Article 14 – paragraph 6 a (new)
6a. Providers of hosting services could, as a voluntary measure in line with the provisions of Article 6, conduct own-investigation measures to prevent illegal content which has previously been identified as illegal from being disseminated again once removed. The obligations related to paragraphs 1 to 6 shall by no means impose general monitoring obligations on hosting services.
2021/07/08
Committee: IMCO
Amendment 1087 #
Proposal for a regulation
Article 14 – paragraph 6 b (new)
6b. Paragraphs 2, 4 and 5 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, or to those enterprises within twelve months of them losing such status pursuant to Article 4(2) thereof.
2021/07/08
Committee: IMCO
Amendment 1089 #
Proposal for a regulation
Article 14 – paragraph 6 c (new)
6c. Paragraphs 2, 4 and 5 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 1096 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services decides to remove or disable access to or radically restrict the visibility of specific items of information provided by the recipients of the service, or to suspend or terminate monetary payments related to those items, irrespective of the means used for detecting, identifying or removing or disabling access to or for restricting the visibility or monetisation of that information and of the reason for its decision, it shall inform the recipient, without undue delay and at the latest within 24 hours after the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/08
Committee: IMCO
Amendment 1102 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, the disabling of access to, or the radical restriction of the visibility of, the information, or the suspension or termination of monetary payments related to that information and, where relevant, the territorial scope of the disabling of access;
2021/07/08
Committee: IMCO
Amendment 1120 #
Proposal for a regulation
Article 15 – paragraph 4
4. Providers of hosting services shall, upon request, share the decisions and the statements of reasons referred to in paragraph 1 with the Digital Service Coordinator of establishment. That information shall not contain personal data.
2021/07/08
Committee: IMCO
Amendment 1122 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4a. Paragraphs 2 to 4 shall not apply to providers of intermediary services that qualify as micro, small or medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, or during the first twelve months from when an enterprise lost such status pursuant to Article 4(2) thereof.
2021/07/08
Committee: IMCO
Amendment 1124 #
Proposal for a regulation
Article 15 – paragraph 4 b (new)
4b. Paragraphs 2 to 4 shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 1129 #
Proposal for a regulation
Article 15 a (new)
Article 15a Notification of suspicions of criminal offences 1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. 2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative and Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place and likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
2021/07/08
Committee: IMCO
Amendment 1137 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro, small or medium sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC, nor during the first twelve months to such enterprises following the loss of such status pursuant to Article 4(2) thereof. This section shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 1157 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions to remove or not to remove or disable access to the information;
2021/07/08
Committee: IMCO
Amendment 1158 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions to suspend or terminate or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
2021/07/08
Committee: IMCO
Amendment 1161 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions to suspend or terminate or not to suspend or terminate the recipients’ account.
2021/07/08
Committee: IMCO
Amendment 1166 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to radically restrict the visibility of content provided by the recipients,
2021/07/08
Committee: IMCO
Amendment 1171 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions to restrict the ability to monetise content provided by the recipients,
2021/07/08
Committee: IMCO
Amendment 1182 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely and diligent manner and in accordance with the rules of the profession. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/07/08
Committee: IMCO
Amendment 1189 #
Proposal for a regulation
Article 17 – paragraph 4 a (new)
4a. Online platforms shall ensure that the decisions, referred to in paragraph 4 are taken by lawyers with at least five years of professional experience.
2021/07/08
Committee: IMCO
Amendment 1194 #
Proposal for a regulation
Article 17 – paragraph 5
5. The decisions referred to in paragraph 4 may exceptionally be taken by automated means, in which case such decisions are supervised by lawyers with at least five years of professional experience.
2021/07/08
Committee: IMCO
Amendment 1203 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) and individuals or entities that have submitted notices, shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/08
Committee: IMCO
Amendment 1213 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point a
(a) it is independent, including financially independent, and impartial of online platforms and recipients of the service provided by the online platforms and of individuals or entities that have submitted notices;
2021/07/08
Committee: IMCO
Amendment 1221 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online;
2021/07/08
Committee: IMCO
Amendment 1222 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point c
(c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology;
2021/07/08
Committee: IMCO
Amendment 1232 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point d
(d) it is capable of settling disputes in a swift, efficient and cost-effective manner that is accessible for persons with disabilities, and in at least one official language of the Union;
2021/07/08
Committee: IMCO
Amendment 1236 #
Proposal for a regulation
Article 18 – paragraph 2 – subparagraph 1 – point e
(e) the dispute settlement takes place in accordance with clear and fair rules of procedure that are clearly visible and easily accessible to all parties concerned and in full compliance with all applicable law.
2021/07/08
Committee: IMCO
Amendment 1242 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. The Digital Services Coordinator shall reassess on a yearly basis whether a certified out-of-court dispute settlement body continues to fulfil the listed criteria. If this is not the case, the Digital Services Coordinator shall revoke that body’s status as an out-of-court dispute settlement body.
2021/07/08
Committee: IMCO
Amendment 1251 #
Proposal for a regulation
Article 18 – paragraph 5
5. Digital Services Coordinators shall notify the Commission of the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including, where applicable, the specifications referred to in the second subparagraph of that paragraph, as well as of the out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated.
2021/07/08
Committee: IMCO
Amendment 1262 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by certified trusted flaggers, within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay, depending on the severity of the illegal activity.
2021/07/08
Committee: IMCO
Amendment 1264 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay.
2021/07/08
Committee: IMCO
Amendment 1265 #
Proposal for a regulation
Article 19 – paragraph 1 a (new)
1a. The notices submitted by trusted flaggers shall not in any way affect other notices. All notices submitted under the mechanisms referred to in Article 14 shall be processed and decided upon without delay and in accordance with the rules of the profession.
2021/07/08
Committee: IMCO
Amendment 1278 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform;
2021/07/08
Committee: IMCO
Amendment 1296 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2 or whose status they have revoked in accordance with paragraph 6.
2021/07/08
Committee: IMCO
Amendment 1308 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, carried out without undue delay, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/07/08
Committee: IMCO
Amendment 1316 #
Proposal for a regulation
Article 19 – paragraph 7 a (new)
7a. Notices submitted by local, regional and national authorities shall be processed and decided upon with the same degree of priority and speed as notices submitted by entities which have been awarded trusted flagger status.
2021/07/08
Committee: IMCO
Amendment 1334 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in a given time frame;
2021/07/08
Committee: IMCO
Amendment 1336 #
Proposal for a regulation
Article 20 – paragraph 3 – point b
(b) the relative proportion thereof in relation to the total number of items of information provided or notices submitted in a given time frame;
2021/07/08
Committee: IMCO
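Points (a) and (b) above, as amended, ask for an absolute count and a relative proportion over a given time frame rather than a fixed past-year window. A minimal sketch of that arithmetic, assuming a simple record of (submission date, manifestly-unfounded flag) pairs; the data model is the author's illustration, not anything prescribed by the amendment:

```python
from datetime import date

def misuse_stats(notices, start: date, end: date):
    """For a given time frame [start, end], return the absolute number
    of manifestly unfounded notices and their proportion of all notices
    submitted in that frame (cf. Article 20(3)(a)-(b) as amended).
    `notices` is an iterable of (submitted_on, manifestly_unfounded) pairs."""
    in_frame = [unfounded for submitted_on, unfounded in notices
                if start <= submitted_on <= end]
    total = len(in_frame)
    unfounded_count = sum(in_frame)  # True counts as 1
    proportion = unfounded_count / total if total else 0.0
    return unfounded_count, proportion
```

The same shape would serve point (a) for manifestly illegal content or complaints, with a different flag in the second tuple slot.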
Amendment 1339 #
Proposal for a regulation
Article 20 – paragraph 3 – point d
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
2021/07/08
Committee: IMCO
Amendment 1349 #
Proposal for a regulation
Article 20 – paragraph 4 a (new)
4a. Providers of hosting services could, as a voluntary measure in line with the provisions of Article 6, conduct own-investigation measures to prevent suspended accounts from reappearing before the suspension is lifted. The obligations related to paragraphs 1 to 4 shall by no means impose general monitoring obligations on hosting services.
2021/07/08
Committee: IMCO
Amendment 1352 #
Proposal for a regulation
Article 21
Article 21
Notification of suspicions of criminal offences
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place and likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
deleted
2021/07/08
Committee: IMCO
Amendment 1353 #
Proposal for a regulation
Article 21 – paragraph 1
1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
deleted
2021/07/08
Committee: IMCO
Amendment 1358 #
Proposal for a regulation
Article 21 – paragraph 2
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place and likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
deleted
2021/07/08
Committee: IMCO
Amendment 1361 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 2
For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, be taking place and likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.
deleted
2021/07/08
Committee: IMCO
Amendment 1381 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
(c) the bank account details of the trader, where the trader is a natural person;
deleted
2021/07/08
Committee: IMCO
Amendment 1391 #
Proposal for a regulation
Article 22 – paragraph 1 – point f
(f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and, where applicable, confirming that all products have been checked against the Union Rapid Alert System for dangerous non-food products (Rapex).
2021/07/08
Committee: IMCO
Amendment 1404 #
Proposal for a regulation
Article 22 – paragraph 2
2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d), (e) and (f) of paragraph 1 is reliable through the use of any freely accessible official online database, such as the Rapex system, or online interfaces made available by a Member State or the Union, or through requests to the trader to provide supporting documents from reliable sources. The online platform shall require that traders promptly inform them of any changes to the information referred to in points (a), (d), (e) and (f) and regularly repeat this verification process.
2021/07/08
Committee: IMCO
Amendment 1413 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
Where the online platform obtains indications that the information under paragraph 1, point (f) is inaccurate, it shall remove the product or service directly from its online platform, and if any other item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/07/08
Committee: IMCO
Amendment 1459 #
Proposal for a regulation
Article 22 a (new)
Article 22a Obligation to inform consumers and authorities about illegal products and services 1. Where an online platform allows consumers to conclude distance contracts with traders, it shall be subject to additional information obligations towards consumers. Where the online platform becomes aware of the illegal nature of a product or service offered by a trader on its interface it shall: (a) immediately remove the illegal product or service from its interface and inform the relevant authorities about it; (b) maintain an internal database of content removed and/or recipients suspended pursuant to Article 20, to be used by internal content moderation systems tackling the identified risks; (c) where the online platform has the contact details of the recipients of its services, inform such recipients of the service that have purchased said product or service during the past twelve months about the illegality, the identity of the trader and options for seeking redress; (d) compile and make publicly available through application programming interfaces a repository containing information about illegal products and services removed from its platform in the past six months along with information about the concerned trader and options for seeking redress.
2021/07/08
Committee: IMCO
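Point (d) of the proposed Article 22a envisages a machine-readable repository of removals from the past six months. A minimal sketch of what such an API-served snapshot could look like; every field name here is the author's illustration (the amendment prescribes only the categories of information: the product or service, the trader concerned, and redress options), and six months is approximated as 183 days:

```python
import json
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class RemovedListing:
    # Field names are illustrative, not prescribed by the amendment.
    product_id: str
    description: str
    trader_name: str
    trader_contact: str
    removed_on: str       # ISO date string, e.g. "2021-06-01"
    redress_options: str

def repository_snapshot(listings, today: date, window_days: int = 183):
    """Return, as JSON, the removals of the past ~six months, as a
    sketch of the repository required by point (d) of Article 22a (new)."""
    cutoff_ordinal = today.toordinal() - window_days
    recent = [asdict(listing) for listing in listings
              if date.fromisoformat(listing.removed_on).toordinal() >= cutoff_ordinal]
    return json.dumps(recent)
```

An online platform would serve the result of `repository_snapshot` from a public application programming interface endpoint and recompute it as listings age past the six-month window.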
Amendment 1470 #
Proposal for a regulation
Article 23 – paragraph 1 – point c
(c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied.
2021/07/08
Committee: IMCO
Amendment 1473 #
Proposal for a regulation
Article 23 – paragraph 2
2. Online platforms shall communicate to the Digital Services Coordinator of establishment, at least once every twelve months, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past twelve months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2021/07/08
Committee: IMCO
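The figure the amendment above asks for is an average over the past twelve months. The official counting methodology is left to delegated acts under Article 25(2), so the following is only a plain arithmetic-mean sketch over twelve Union-wide monthly figures:

```python
def average_monthly_active_recipients(monthly_counts):
    """Average monthly active recipients in the Union over the past
    twelve months (cf. Article 23(2) as amended). How an 'active
    recipient' is counted is left to delegated acts; this sketch
    only averages twelve already-computed monthly figures."""
    if len(monthly_counts) != 12:
        raise ValueError("expected exactly twelve monthly figures")
    return sum(monthly_counts) / 12
```

The same number also matters elsewhere in the proposal, since it feeds the very-large-online-platform designation under Article 25.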
Amendment 1476 #
Proposal for a regulation
Article 23 – paragraph 2 a (new)
2a. Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in the context of exercising their supervisory powers.
2021/07/08
Committee: IMCO
Amendment 1503 #
Proposal for a regulation
Article 24 – paragraph 1 – point c a (new)
(ca) the contracted amount of payment for the online advertising, expressed in euros, if it is a paid advertisement.
2021/07/08
Committee: IMCO
Amendment 1510 #
Proposal for a regulation
Article 24 – paragraph 1 a (new)
2. Online platforms shall provide the information mentioned in paragraph 1 to public authorities, upon their request, in order to determine accountability in cases of false or misleading advertising.
2021/07/08
Committee: IMCO
Amendment 1512 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
3. Providers of intermediary services shall obtain consent from the recipients of their service in order to provide them with micro-targeted and behavioural advertising. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information.
2021/07/08
Committee: IMCO
Amendment 1534 #
Proposal for a regulation
Article 25 – paragraph 1 a (new)
1a. This section shall not apply where, within the framework of an organised distribution network operating under a common brand, the provider of the intermediary service has a direct organisational, associative, cooperative or capital ownership link with the recipient of the service or where the intermediary service solely aims to intermediate content between the members of the organised distribution framework and their suppliers.
2021/07/08
Committee: IMCO
Amendment 1552 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the dissemination of illegal content on their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1567 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively through dissemination of illegal content;
2021/07/08
Committee: IMCO
Amendment 1577 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative and illegal effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
2021/07/08
Committee: IMCO
Amendment 1587 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2021/07/08
Committee: IMCO
Amendment 1596 #
Proposal for a regulation
Article 26 – paragraph 2 a (new)
2a. The obligations detailed in paragraphs 1 and 2 shall by no means lead to a general monitoring obligation.
2021/07/08
Committee: IMCO
Amendment 1604 #
Proposal for a regulation
Article 27 – paragraph 1 – introductory part
1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures targeting illegal practices, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable:
2021/07/08
Committee: IMCO
Amendment 1661 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any voluntary commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
2021/07/08
Committee: IMCO
Amendment 1665 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
2021/07/08
Committee: IMCO
Amendment 1702 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Obligations pursuant to paragraphs 1 and 2 shall not oblige a very large online platform to disclose information that will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets and intellectual property rights. Further, very large online platforms shall not be required to enable modification of systems essential to uphold the safety and security of the service.
2021/07/08
Committee: IMCO
Amendment 1719 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
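The amendment above shortens the ad-repository retention window from one year to six months after an advertisement's last display. A minimal pruning sketch, assuming a simple mapping from advertisement identifier to last-display date; the 183-day figure approximates six months and is the author's choice, not the amendment's:

```python
from datetime import date, timedelta

def prune_ad_repository(ads, today: date):
    """Keep an advertisement in the public repository until six months
    (approximated here as 183 days) after it was last displayed, per
    Article 30(1) as amended; drop older entries.
    `ads` maps ad_id -> date of last display."""
    cutoff = today - timedelta(days=183)
    return {ad_id: last_displayed for ad_id, last_displayed in ads.items()
            if last_displayed >= cutoff}
```

In practice the repository entries would also carry the paragraph 2 metadata (content, advertiser, display period, targeting parameters), with personal data of recipients excluded as paragraph 1 requires; only the retention logic is sketched here.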
Amendment 1767 #
Proposal for a regulation
Article 31 – paragraph 4
4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, disclose the funding of the research, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
2021/07/08
Committee: IMCO
Amendment 1783 #
Proposal for a regulation
Article 31 – paragraph 7
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
deleted
2021/07/08
Committee: IMCO
Amendment 1798 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every twelve months.
2021/07/08
Committee: IMCO
Amendment 1808 #
Proposal for a regulation
Article 33 a (new)
Article 33a Algorithm transparency 1. When using automated decision making, the very large online platform shall upon request provide the Commission with the necessary information to assess the algorithms used. 2. When carrying out the assessments referred to in paragraph 1, the Commission shall consider the following elements: (a) the compliance with corresponding Union requirements; (b) potential negative effects on fundamental rights, including on consumer rights, through dissemination of illegal content. 3. Following an assessment, the Commission shall communicate its findings to the very large online platform and allow it to provide additional explanations. 4. Where the Commission finds that the algorithm used by the very large online platform does not comply with point (a) or (b) of paragraph 2 of this Article, the Commission shall inform the Digital Service Coordinator of establishment of the very large online platform.
2021/07/08
Committee: IMCO
Amendment 1829 #
Proposal for a regulation
Article 34 – paragraph 1 – point f a (new)
(fa) protection and promotion of children’s rights and well-being, harmonised with the UN Convention on the Rights of the Child and the Convention’s General Comment No. 25.
2021/07/08
Committee: IMCO
Amendment 1831 #
Proposal for a regulation
Article 34 – paragraph 1 – point f b (new)
(fb) accessibility of elements and functions of online platforms and digital services for persons with disabilities aiming at consistency and coherence with existing harmonised accessibility requirements when these elements and functions are not already covered by existing harmonised European standards;
2021/07/08
Committee: IMCO
Amendment 1838 #
Proposal for a regulation
Article 34 – paragraph 2
2. The Commission shall support the update of the standards and guidelines in the light of technological and legislative developments and the behaviour of the recipients of the services in question.
2021/07/08
Committee: IMCO
Amendment 1846 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. The Commission shall also encourage and facilitate regular review and adaptation of the codes of conduct to ensure that they are fit for purpose.
2021/07/08
Committee: IMCO
Amendment 1853 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/07/08
Committee: IMCO
Amendment 1864 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments should take into account differences in size and capacity between different participants.
2021/07/08
Committee: IMCO
Amendment 1866 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, in relation to the dissemination of illegal content, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
2021/07/08
Committee: IMCO
Amendment 1883 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service, and civil society organisations or relevant authorities to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30.
2021/07/08
Committee: IMCO
Amendment 1897 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
2021/07/08
Committee: IMCO
Amendment 1945 #
Proposal for a regulation
Article 41 – paragraph 2 – subparagraph 1 – point e
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm, without prejudice to fundamental rights.
2021/07/08
Committee: IMCO
Amendment 1978 #
Proposal for a regulation
Article 44 – paragraph 2 – point b a (new)
(ba) the conditions met to justify any order to act against illegal content or to provide information that derogates from the internal market clause in accordance with Article 3 of Directive 2000/31/EC.
2021/07/08
Committee: IMCO
Amendment 2039 #
Proposal for a regulation
Article 47 – paragraph 2 – point a a (new)
(aa) contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market;
2021/07/08
Committee: IMCO
Amendment 2088 #
Proposal for a regulation
Article 49 – paragraph 1 – point d a (new)
(da) monitor derogations from the internal market clause in accordance with Article 3 of Directive 2000/31/EC and ensure that the conditions for derogation are interpreted strictly and narrowly to ensure consistent application of this Regulation;
2021/07/08
Committee: IMCO
Amendment 2089 #
Proposal for a regulation
Article 49 – paragraph 1 – point e
(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in close collaboration with relevant stakeholders as provided for in this Regulation, as well as the identification of emerging issues, with regard to matters covered by this Regulation.
2021/07/08
Committee: IMCO
Amendment 2164 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement, without prejudice to fundamental rights.
2021/07/08
Committee: IMCO
Amendment 2182 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide explanations relating to, and where necessary access to, its databases and algorithms.
2021/07/08
Committee: IMCO
Amendment 2209 #
Proposal for a regulation
Article 59 – paragraph 1 – introductory part
1. In the decision pursuant to Article 58, the Commission shall impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently:
2021/07/08
Committee: IMCO
Amendment 2212 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The Commission may by decision and in compliance with the proportionality principle impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or as a result of repeated negligence:
2021/07/08
Committee: IMCO
Amendment 2296 #
Proposal for a regulation
Article 74 – paragraph 2
2. It shall apply from [date - twelve months after its entry into force].
2021/07/08
Committee: IMCO