
Activities of Andrea CAROPPO related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (74)

Amendment 208 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the directing of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The directing of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member State as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. __________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
2021/07/08
Committee: IMCO
Amendment 236 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the service is provided by cooperative organisations exclusively to their members established in the European Union with whom they have a direct organisational, cooperative or capital ownership link within the framework of an organised distribution network operating publicly under a common brand, or where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
2021/07/08
Committee: IMCO
Amendment 259 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic and passive processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider or where the provider prioritises or promotes the content, its presentation or monetisation beyond offering basic search and indexing functionalities that are absolutely necessary to navigate the content.
2021/07/08
Committee: IMCO
Amendment 264 #
Proposal for a regulation
Recital 18 a (new)
(18a) Those exemptions from liability should also not be available to providers of intermediary services that do not comply with the due diligence obligations in this Regulation. The conditionality should further ensure that the standards to qualify for those exemptions contribute to a high level of safety and trust in the online environment in a manner that promotes a fair balance of the rights of all stakeholders.
2021/07/08
Committee: IMCO
Amendment 267 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately engages with a recipient of the services in order to undertake illegal activities does not provide its service neutrally nor passively and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
2021/07/08
Committee: IMCO
Amendment 284 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, hosting services should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as they present the relevant information relating to the transactions or exchanges at issue in such a way that it leads consumers to believe that the information was provided by those services providers themselves or by recipients of the service acting under their authority or control, and that those services providers thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/07/08
Committee: IMCO
Amendment 296 #
Proposal for a regulation
Recital 25
(25) In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon.
deleted
2021/07/08
Committee: IMCO
Amendment 325 #
Proposal for a regulation
Recital 29
(29) Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders.
2021/07/08
Committee: IMCO
Amendment 329 #
Proposal for a regulation
Recital 30
(30) Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition on Member States to impose a monitoring obligation of a general nature. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information.
2021/07/08
Committee: IMCO
Amendment 335 #
Proposal for a regulation
Recital 31
(31) The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity.
2021/07/08
Committee: IMCO
Amendment 337 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information should include the relevant contact details necessary to ensure such compliance. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
2021/07/08
Committee: IMCO
Amendment 358 #
Proposal for a regulation
Recital 36 a (new)
(36a) Providers of intermediary services should also establish a single point of contact for recipients of services, allowing rapid, direct and efficient communication.
2021/07/08
Committee: IMCO
Amendment 392 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should prevent the reappearance of the notified or equivalent illegal information. The provider should also inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/08
Committee: IMCO
Amendment 396 #
Proposal for a regulation
Recital 42 a (new)
(42a) A hosting service provider may in some instances become aware, for instance through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the hosting service provider is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the hosting service provider should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by hosting service providers. Hosting service providers should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
2021/07/08
Committee: IMCO
Amendment 400 #
Proposal for a regulation
Article 19 – paragraph 1
1. The providers of hosting services shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and acted upon at face value with priority and without delay, and in appropriate circumstances, immediately.
2021/06/24
Committee: ITRE
Amendment 405 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift, non-discriminatory and fair outcomes. In addition, provision should be made for the possibility of out-of-court dispute settlement of disputes, including those that could not be resolved in satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a simple, affordable, expedient and accessible manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/07/08
Committee: IMCO
Amendment 414 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they have significant legitimate interests, and that they have a proven record in flagging content with a high rate of accuracy and have demonstrated competence for the purposes of detecting, identifying and notifying illegal content. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43 __________________ 43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53
2021/07/08
Committee: IMCO
Amendment 421 #
Proposal for a regulation
Article 20 – paragraph 1
1. Providers of intermediary services shall suspend, for a reasonable period of time, or in appropriate circumstances terminate, and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.
2021/06/24
Committee: ITRE
Amendment 425 #
Proposal for a regulation
Article 20 – paragraph 2
2. Providers of intermediary services shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.
2021/06/24
Committee: ITRE
Amendment 425 #
Proposal for a regulation
Recital 48
(48) An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of person, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council44. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities. __________________ 44 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
deleted
2021/07/08
Committee: IMCO
Amendment 428 #
Proposal for a regulation
Article 20 – paragraph 3 – introductory part
3. Providers of intermediary services shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following:
2021/06/24
Committee: ITRE
Amendment 431 #
Proposal for a regulation
Article 20 – paragraph 3 – point a
(a) the absolute numbers of items of manifestly illegal content or knowingly unfounded notices or complaints, submitted in the past year;
2021/06/24
Committee: ITRE
Amendment 433 #
Proposal for a regulation
Article 20 – paragraph 4
4. Providers of intermediary services shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension and the circumstances in which they will terminate their services.
2021/06/24
Committee: ITRE
Amendment 440 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. A provider of intermediary services shall ensure it has obtained the following information from a trader before starting the use of its services:
2021/06/24
Committee: ITRE
Amendment 454 #
Proposal for a regulation
Article 22 – paragraph 2
2. The provider of intermediary services shall, upon receiving that information, take effective steps that would reasonably be taken by a diligent operator in accordance with a high industry standard of professional diligence to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is accurate, current and reliable through the use of independent and reliable sources including any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. The provider of intermediary services should require that traders promptly inform them of any changes to the information referred to in points (a), (d) and (e) and regularly repeat this verification process at least once per year. The provider of intermediary services should ensure that any trader, against whom the measure set out in Article 20(1) was applied, is not permitted to use the service, including under a different name.
2021/06/24
Committee: ITRE
Amendment 463 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 1
3. Where the provider of intermediary services obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate, out of date or incomplete, that provider shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law.
2021/06/24
Committee: ITRE
Amendment 465 #
Proposal for a regulation
Article 22 – paragraph 3 – subparagraph 2
Where the trader fails to correct or complete that information, the provider of intermediary services shall suspend the provision of its service to the trader until the request is complied with.
2021/06/24
Committee: ITRE
Amendment 468 #
Proposal for a regulation
Article 22 – paragraph 4
4. The provider of intermediary services shall store the information obtained pursuant to paragraph 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information.
2021/06/24
Committee: ITRE
Amendment 469 #
Proposal for a regulation
Article 22 – paragraph 5
5. Without prejudice to paragraph 2, the provider of intermediary services shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation, and where the interested parties need to access information for the legitimate purpose of investigating infringements and enforcing their rights.
2021/06/24
Committee: ITRE
Amendment 472 #
Proposal for a regulation
Article 22 – paragraph 6
6. The provider of intermediary services shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner.
2021/06/24
Committee: ITRE
Amendment 474 #
Proposal for a regulation
Article 22 – paragraph 7
7. The provider of intermediary services shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
2021/06/24
Committee: ITRE
Amendment 676 #
Proposal for a regulation
Article 2 – paragraph 1 – point f – indent 3
— a ‘hosting’ service that consists of the storage or the allowance of storage of information provided by, and at the request of, a recipient of the service;
2021/07/08
Committee: IMCO
Amendment 694 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, with the exception of services provided by cooperative organisations exclusively to their members established in the European Union with whom they have a direct organisational, cooperative, associative or capital ownership link within the framework of an organised distribution network operating publicly under a common brand, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
2021/07/08
Committee: IMCO
Amendment 728 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
2021/07/08
Committee: IMCO
Amendment 747 #
Proposal for a regulation
Article 2 a (new)
Article 2a Conditionality to the compliance with due diligence obligations Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when they do not comply with the due diligence obligations set out in Chapter III of this Regulation.
2021/07/08
Committee: IMCO
Amendment 754 #
Proposal for a regulation
Article 5 – paragraph 1 – introductory part
1. Where an information society service is provided that consists of the storage or the allowance of storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service on condition that the provider:
2021/07/08
Committee: IMCO
Amendment 771 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the hosting service provider itself or by a recipient of the service who is acting under its authority or control.
2021/07/08
Committee: IMCO
Amendment 774 #
Proposal for a regulation
Article 5 – paragraph 3 a (new)
3a. Providers of intermediary services shall be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 when their main purpose is to engage in or facilitate illegal activities.
2021/07/08
Committee: IMCO
Amendment 779 #
Proposal for a regulation
Article 6
Article 6
Voluntary own-initiative investigations and legal compliance
Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation.
deleted
2021/07/08
Committee: IMCO
Amendment 800 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken. Under the condition that necessary safeguards are provided, such orders could, in particular, consist of catalogue-wide and dynamic injunctions by courts or administrative authorities requiring the termination or prevention of any infringement.
2021/07/08
Committee: IMCO
Amendment 821 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 2
one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned;
2021/07/08
Committee: IMCO
Amendment 837 #
Proposal for a regulation
Article 8 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10.
2021/07/08
Committee: IMCO
Amendment 864 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority of issuing the order of its receipt and the effect given to the order.
2021/07/08
Committee: IMCO
Amendment 876 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already collected for the purposes of enabling the identification of recipients of the service and which lies within its control;
2021/07/08
Committee: IMCO
Amendment 881 #
Proposal for a regulation
Article 9 – paragraph 2 – point c
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10;
2021/07/08
Committee: IMCO
Amendment 908 #
Proposal for a regulation
Article 10 a (new)
Article 10a Point of contact for recipients of a service 1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with the recipients of their services. The means of communication shall be user-friendly and easily accessible. 2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact for recipients.
2021/07/08
Committee: IMCO
Amendment 918 #
Proposal for a regulation
Article 11 – paragraph 4 a (new)
4a. Providers of intermediary services that would qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC if established in the Union, and who have been unsuccessful in designating a legal representative after reasonable efforts, shall be able to request that the Digital Services Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including the possibility for collective representation.
2021/07/08
Committee: IMCO
Amendment 925 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including information about algorithmic decision-making and human review. Providers of intermediary services shall also include information on the right to terminate the use of the service. The possibility to terminate must be easily accessible for the user. Information on remedies and redress mechanisms shall also be included in the terms and conditions. The terms and conditions shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/08
Committee: IMCO
Amendment 989 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures;
2021/07/08
Committee: IMCO
Amendment 1042 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content;
2021/07/08
Committee: IMCO
Amendment 1058 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned and shall create an obligation on behalf of the notified provider of hosting services to remove or disable access to the notified information expeditiously.
2021/07/08
Committee: IMCO
Amendment 1064 #
Proposal for a regulation
Article 14 – paragraph 4
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.
2021/07/08
Committee: IMCO
Amendment 1072 #
Proposal for a regulation
Article 14 – paragraph 6
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. When a decision has been taken to remove or disable information, the providers of hosting services shall take all necessary measures to prevent the same or equivalent illegal material from reappearing on their service. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/08
Committee: IMCO
Amendment 1102 #
Proposal for a regulation
Article 15 – paragraph 2 – point a
(a) whether the decision entails either the removal of, or the disabling of access to, or radical restriction of the visibility of, the information or the suspension or termination of monetary payments related to that information and, where relevant, the territorial scope of the disabling of access;
2021/07/08
Committee: IMCO
Amendment 1166 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(ca) decisions to radically restrict the visibility of content provided by the recipients,
2021/07/08
Committee: IMCO
Amendment 1171 #
Proposal for a regulation
Article 17 – paragraph 1 – point c b (new)
(cb) decisions to restrict the ability to monetise content provided by the recipients,
2021/07/08
Committee: IMCO
Amendment 1272 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it has a significant legitimate interest, either collectively or as individual entity, is independent from any online platform, and has a proven expertise of flagging illegal content with a high rate of accuracy;
2021/07/08
Committee: IMCO
Amendment 1278 #
Proposal for a regulation
Article 19 – paragraph 2 – point b
(b) it represents collective interests and is independent from any online platform;
2021/07/08
Committee: IMCO
Amendment 1296 #
Proposal for a regulation
Article 19 – paragraph 3
3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or have been revoked in accordance with paragraph 6.
2021/07/08
Committee: IMCO
Amendment 1305 #
Proposal for a regulation
Article 19 – paragraph 5
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of wrongful notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
2021/07/08
Committee: IMCO
Amendment 1308 #
Proposal for a regulation
Article 19 – paragraph 6
6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received by third parties, carried out without undue delay, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.
2021/07/08
Committee: IMCO
Amendment 1512 #
Proposal for a regulation
Article 24 – paragraph 1 b (new)
3. Providers of intermediary services shall obtain consent from the recipients of their service, in order to provide them with micro targeted and behavioural advertisement. Providers of intermediary services shall ensure that recipients of services can easily make an informed choice when expressing their consent by providing them with meaningful information.
2021/07/08
Committee: IMCO
Amendment 1554 #
Proposal for a regulation
Article 26 – paragraph 1 – introductory part
1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
2021/07/08
Committee: IMCO
Amendment 1557 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) the dissemination and amplification of illegal content through their services;
2021/07/08
Committee: IMCO
Amendment 1561 #
Proposal for a regulation
Article 26 – paragraph 1 – point a a (new)
(aa) the funding of illegal content, including models based on advertisement;
2021/07/08
Committee: IMCO
Amendment 1568 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for human dignity, private and family life, freedom of expression and information, right to property, the prohibition of discrimination and the rights of the child, as enshrined in Articles 1, 7, 11, 17, 21 and 24 of the Charter respectively;
2021/07/08
Committee: IMCO
Amendment 1665 #
Proposal for a regulation
Article 28 – paragraph 2 – point a
(a) are independent from the very large online platform concerned and have not provided any other service to the platform in the previous 12 months;
2021/07/08
Committee: IMCO
Amendment 1714 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available to relevant authorities, publishers, advertisers and vetted researchers that meet the requirements listed in paragraph 4 of this Article or Article 31 through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
Amendment 1720 #
Proposal for a regulation
Article 30 – paragraph 2 – point a
(a) the content of the advertisement, in particular, the name of the product, service or brand and the object of the advertisement;
2021/07/08
Committee: IMCO
Amendment 1733 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
(e) the total number of recipients of the service reached in each country and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically.
2021/07/08
Committee: IMCO
Amendment 1736 #
Proposal for a regulation
Article 30 – paragraph 2 a (new)
2a. When very large online platforms sell advertising for display on their online interface, the contract signed with the buyer or the buyer’s representative includes a clause providing that the platform guarantees that no content adjacent to the advertisement is incompatible with the terms and conditions of the platform or with the law of the Member States of residence of the recipients of the service to whom the advertisement will be displayed. Any clause to the contrary shall be null and void.
2021/07/08
Committee: IMCO
Amendment 1742 #
Proposal for a regulation
Article 30 – paragraph 2 b (new)
2b. Very large online platforms that display advertising on their online interfaces shall conduct at their own expense, upon the request of advertisers and publishers, independent audits performed by organisations complying with the criteria set in Article 28(2), on a reasonable frequency, under fair and proportionate conditions agreed upon by platforms, advertisers and publishers, to: (a) conduct a quantitative and qualitative assessment of cases where advertising is associated with illegal content; (b) detect fraudulent use of their services to fund illegal activities; (c) assess the performance of their tools in terms of brand safety. The report shall include an audit opinion on the performance of their tools in terms of brand safety, either positive, positive with comments or negative and, where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance. These platforms shall make available to advertisers and publishers, upon their request, the results of that audit.
2021/07/08
Committee: IMCO
Amendment 1880 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30, but also to further transparency between all the players involved in the programmatic advertising value chain.
2021/07/08
Committee: IMCO
Amendment 1887 #
Proposal for a regulation
Article 36 – paragraph 2 – point b a (new)
(ba) the set-up of a common or unique identifier constituted by multiple elements (such as the advertiser identifier and references to the brand of the campaign, its product, and the reference of the purchase) which enables advertisers and publishers to identify and track a campaign throughout its lifecycle.
2021/07/08
Committee: IMCO