Activities of Alfred SANT related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Shadow opinions (1)

OPINION on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
2021/10/29
Committee: ECON
Dossiers: 2020/0361(COD)
Documents: PDF(394 KB) DOC(264 KB)
Authors: Mikuláš PEKSA (MEP ID 197539)

Amendments (58)

Amendment 130 #
Proposal for a regulation
Recital 1
(1) Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council [25], new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks, not least cybersecurity risks, and challenges, both for individual users and for society and the economy as a whole.
_________________
[25] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') (OJ L 178, 17.7.2000, p. 1).
2021/09/10
Committee: ECON
Amendment 131 #
Proposal for a regulation
Recital 1 a (new)
(1a) The digitalisation of European society and its economy often leaves policy makers, corporations and citizens struggling to catch up. Furthermore, the accumulation of data regularly creates an uneven competitive playing field on the market, since data is used as a tool to determine who enters and who exits the market.
2021/09/10
Committee: ECON
Amendment 137 #
Proposal for a regulation
Recital 4
(4) Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective, risk-based and proportionate mandatory rules should be established at Union level. This Regulation provides the right conditions and competitive settings for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers, fostering interoperability and assuring the possibility for new entrants to penetrate the market. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated.
2021/09/10
Committee: ECON
Amendment 175 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously and in good faith to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/09/10
Committee: ECON
Amendment 179 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
2021/09/10
Committee: ECON
Amendment 184 #
Proposal for a regulation
Recital 27
(27) Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem that makes it harder both for policymakers to manage and for new entrants to penetrate the market. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduits’, ‘caching’ or hosting services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or hosting services.
2021/09/10
Committee: ECON
Amendment 199 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms holding a dominant position in the market and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should be obliged to comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to achieve the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices, safeguarding the competitive nature of the sector by assuring the possibility for new entrants to penetrate the market, and protecting fundamental rights online.
2021/09/10
Committee: ECON
Amendment 202 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact, free of charge, and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/09/10
Committee: ECON
Amendment 209 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes.
2021/09/10
Committee: ECON
Amendment 243 #
Proposal for a regulation
Recital 54
(54) Very large online platforms may cause societal and economic risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative socioeconomic impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means.
2021/09/10
Committee: ECON
Amendment 244 #
Proposal for a regulation
Recital 55
(55) In view of the network effects characterising the platform economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market, economic actors and consumers. This may be the case in the event of exponential growth experienced in short periods of time, or of a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator should be able to request more frequent reporting from the platform on the user base so as to identify in a timely manner the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation.
2021/09/10
Committee: ECON
Amendment 246 #
Proposal for a regulation
Recital 56
(56) Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement at both European and national level, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuses by the recipients of the service, and take appropriate mitigating measures.
2021/09/10
Committee: ECON
Amendment 250 #
Proposal for a regulation
Recital 57
(57) Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant societal and economic systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
2021/09/10
Committee: ECON
Amendment 255 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, the competitive aspect of the economy, the security of trade, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/09/10
Committee: ECON
Amendment 257 #
Proposal for a regulation
Recital 60 a (new)
(60a) Auditors of digital services, whether independent or not, need to have specific competences and expertise in the sector, both technological and operational. They also need to be knowledgeable in the social, economic and human rights issues involved, among others. Whether SMEs or multinationals, extensions of existing accountancy and auditing, legal, and ICT consultancy or similar firms cannot be automatically assumed to have the required knowhow to qualify as auditors. Member States and the Commission are therefore encouraged to develop protocols, following consultation with all actors involved, by which to assess and accredit auditors of digital services, preferably according to clear rules devised on a Union basis, and thereby to establish registers of accredited auditors at national and European level.
2021/09/10
Committee: ECON
Amendment 261 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms pose particular risks, not least at the economic and political levels, and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. In particular, the accumulation of personal data by online platforms is converted into massive commercial assets often used as a way to give an uneven advantage to certain economic players over others. Therefore, very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civic discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
2021/09/10
Committee: ECON
Amendment 265 #
Proposal for a regulation
Recital 65
(65) Given the complexity of the functioning of the systems deployed and the systemic risks they present to society and the economy, very large online platforms should appoint compliance officers, who should have the necessary qualifications to operationalise measures and monitor compliance with this Regulation within the platform’s organisation. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation.
2021/09/10
Committee: ECON
Amendment 272 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society, the economy and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with the purpose of obtaining economic gain, which from a microeconomic perspective are particularly harmful for vulnerable recipients of the service, such as children, but which could also hamper the competitive aspect of the market. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
2021/09/10
Committee: ECON
Amendment 276 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security, the economy of one or more Member States, or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
2021/09/10
Committee: ECON
Amendment 282 #
Proposal for a regulation
Recital 77
(77) Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers, human resources and financial means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. Furthermore, the Digital Services Coordinator of each Member State should establish a structured working relationship with the National Competition Authorities as well as the Financial Regulatory Authorities operating in their territory.
2021/09/10
Committee: ECON
Amendment 284 #
Proposal for a regulation
Recital 87
(87) In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, assistance from the Commission or otherwise ask the Commission to intervene and exercise its investigatory and enforcement powers under this Regulation.
2021/09/10
Committee: ECON
Amendment 293 #
Proposal for a regulation
Recital 93 a (new)
(93a) However, the digital services sector is a fast-moving one in which Europe cannot afford regulation that lags behind technological and operational innovations. Governance structures should remain fit for purpose, flexible and transparent. While ensuring accountability on the part of players in the sector, they must themselves remain accountable. Regulatory structures in which any one institution is granted powers such that it can operate, or appear to operate, as prosecution, jury and judge could easily create problems of checks and balances, thereby stimulating more litigation; they could also be less flexible in dealing with innovation. Therefore the Board should, during the first five years after this Regulation enters into force, carry out a continuous assessment of governance structures related to this Regulation and eventually make recommendations for their improvement, their streamlining, and the consolidation of effective checks and balances mechanisms.
2021/09/10
Committee: ECON
Amendment 294 #
Proposal for a regulation
Recital 94
(94) Given the importance of very large online platforms, in view of their reach and impact, their failure to comply with the specific obligations applicable to them may affect a substantial number of recipients of the services across different Member States and may cause large societal and economic harms, while such failures may also be particularly complex to identify and address.
2021/09/10
Committee: ECON
Amendment 297 #
Proposal for a regulation
Recital 97
(97) The Commission should remain free, on the basis of this Regulation and other relevant EU law, to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission has initiated proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary.
2021/09/10
Committee: ECON
Amendment 305 #
Proposal for a regulation
Article 1 – paragraph 2 – point a
(a) contribute to the proper functioning of the internal market for intermediary services and impacted economic actors;
2021/09/10
Committee: ECON
Amendment 330 #
Proposal for a regulation
Article 2 – paragraph 1 – point n
(n) ‘advertisement’ means information designed to directly or indirectly promote or rank information, products or services of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface or parts thereof against direct or indirect remuneration specifically for promoting that information, product or service;
2021/09/10
Committee: ECON
Amendment 337 #
Proposal for a regulation
Article 2 – paragraph 1 – point q a (new)
(qa) ‘trusted flagger’ means an economically and politically neutral entity representing collective interests whose expertise and competence are dedicated to detecting, identifying and notifying illegal content;
2021/09/10
Committee: ECON
Amendment 339 #
Proposal for a regulation
Article 4 – paragraph 1 – point e
(e) the provider acts expeditiously and in good faith to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
2021/09/10
Committee: ECON
Amendment 342 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts expeditiously and in good faith to remove or to disable access to the illegal content.
2021/09/10
Committee: ECON
Amendment 353 #
Proposal for a regulation
Article 8 – paragraph 2 – point a – indent 3
— information about redress available to the provider of the service and to the recipient of the service who provided the content, which may be sought in the Member State of establishment of the provider of the service and/or in the Member State of establishment of the recipient of the service who provided the content;
2021/09/10
Committee: ECON
Amendment 355 #
Proposal for a regulation
Article 8 – paragraph 2 – point b a (new)
(ba) the order must make clear the neutral and non-discriminatory approach of the decision;
2021/09/10
Committee: ECON
Amendment 371 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. Such restrictions shall in no way serve to provide selected economic actors with hidden competitive advantages. The relevant information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/09/10
Committee: ECON
Amendment 396 #
Proposal for a regulation
Article 14 – paragraph 2 – point d
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete as well as the relationship, economic or otherwise, if any, the individual or entity has with the notified entity.
2021/09/10
Committee: ECON
Amendment 406 #
Proposal for a regulation
Article 15 – paragraph 2 – point d
(d) where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground including explanations in relation to the arguments submitted under Article 14 paragraph 2A, where relevant;
2021/09/10
Committee: ECON
Amendment 430 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not taken solely on the basis of automated means and involve human settlement in the case of dispute or redress.
2021/09/10
Committee: ECON
Amendment 441 #
Proposal for a regulation
Article 19 – paragraph 2 – point b a (new)
(ba) it does not have any economic, social or political interest in the exit from the market of the reported entity;
2021/09/10
Committee: ECON
Amendment 478 #
Proposal for a regulation
Article 24 – paragraph 1 – introductory part
Online platforms that directly or indirectly display advertising on their online interfaces or parts thereof shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual consumer, in a clear, concise but meaningful, uniform and unambiguous manner and in real time:
2021/09/10
Committee: ECON
Amendment 479 #
Proposal for a regulation
Article 24 – paragraph 1 – point a
(a) that the information displayed is an advertisement and whether the advertisement is a result of an automated mechanism, such as an advertising exchange mechanism;
2021/09/10
Committee: ECON
Amendment 480 #
Proposal for a regulation
Article 24 – paragraph 1 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly finances the advertisement;
2021/09/10
Committee: ECON
Amendment 481 #
Proposal for a regulation
Article 24 – paragraph 1 – point b a (new)
(ba) whether the advertising is based on any form of targeting; and
2021/09/10
Committee: ECON
Amendment 482 #
Proposal for a regulation
Article 24 – paragraph 1 – point c
(c) meaningful, granular and specific information about the main parameters used to target and display the advertisement, which allows the consumer to determine why and how the advertisement in question was shown to him or her. This information shall include categories of data that targeted forms of advertising would use to address and categorise consumers and the data platforms share with advertisers for advertising targeting purposes.
2021/09/10
Committee: ECON
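Taken together, Amendments 478 to 482 enumerate the disclosures an online platform would have to surface for every advertisement: the fact that it is an ad and whether it came through an automated exchange, the sponsor and financier, whether targeting was used, and the main targeting parameters. Purely as an illustration of how those per-advertisement fields could be structured, and not as part of the legislative text, the following Python sketch models them; every class and field name here is invented for the example.

# Illustrative sketch only: a hypothetical record of the per-advertisement
# disclosures required by Article 24(1) as amended; field names are
# invented for illustration, not taken from the Regulation.
from dataclasses import dataclass, field

@dataclass
class AdDisclosure:
    is_advertisement: bool        # point (a): flagged to the user as an ad
    via_automated_exchange: bool  # point (a): result of an ad exchange mechanism
    sponsor: str                  # point (b): on whose behalf the ad is displayed
    financed_by: str              # point (b): who directly or indirectly finances it
    is_targeted: bool             # point (ba): whether any form of targeting is used
    # point (c): main parameters used to target and display the ad, including
    # the categories of data shared with advertisers for targeting purposes
    targeting_parameters: list[str] = field(default_factory=list)

# Example record for a single displayed advertisement
disclosure = AdDisclosure(
    is_advertisement=True,
    via_automated_exchange=True,
    sponsor="Example Brand Ltd",
    financed_by="Example Brand Ltd",
    is_targeted=True,
    targeting_parameters=["inferred interest: running", "age bracket 25-34"],
)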
Amendment 488 #
Proposal for a regulation
Article 25 – paragraph 1
1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3, or which exercise a dominant position over a specific market sector.
2021/09/10
Committee: ECON
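As a purely illustrative aid, and not part of the amendment text, the sketch below expresses the numeric test in Article 25(1) as amended: an average of at least 45 million monthly active recipients in the Union (the 10% of Union population figure from Recital 54), or a dominant position over a specific market sector. The actual counting methodology is left to the delegated acts referred to in paragraph 3, so the function and its names are hypothetical.

# Illustrative sketch only: the Regulation defers the exact counting
# methodology to delegated acts (Article 25(3)); this toy check merely
# mirrors the numeric threshold quoted in the amendment.
VLOP_THRESHOLD = 45_000_000  # roughly 10% of the Union population (Recital 54)

def is_very_large_online_platform(monthly_active_recipients: list[int],
                                  dominant_position: bool = False) -> bool:
    """Hypothetical check against Article 25(1) as amended: average monthly
    active recipients in the Union of at least 45 million, or a dominant
    position over a specific market sector."""
    average = sum(monthly_active_recipients) / len(monthly_active_recipients)
    return average >= VLOP_THRESHOLD or dominant_position

# Example: six months of recipient counts averaging above the threshold
print(is_very_large_online_platform(
    [44_000_000, 45_000_000, 46_000_000, 47_000_000, 48_000_000, 46_000_000]))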
Amendment 497 #
Proposal for a regulation
Article 26 – paragraph 1 – point a
(a) details on the dissemination of illegal content through their services and impacted jurisdictions;
2021/09/10
Committee: ECON
Amendment 501 #
Proposal for a regulation
Article 26 – paragraph 1 – point b a (new)
(ba) impact on the economy and the competitiveness of single Member States or the EU market as relevant;
2021/09/10
Committee: ECON
Amendment 508 #
Proposal for a regulation
Article 27 – paragraph 1 – point a a (new)
(aa) Those reports should be easily accessible and free of charge to the general public and include standardised, open data describing the systemic risks, including socioeconomic ones.
2021/09/10
Committee: ECON
Amendment 509 #
Proposal for a regulation
Article 27 – paragraph 1 – point c
(c) reinforcing the internal processes or supervision, not solely based on automated systems, of any of their activities in particular as regards detection of systemic risk;
2021/09/10
Committee: ECON
Amendment 512 #
Proposal for a regulation
Article 27 – paragraph 2 – point a
(a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Article 31 and 33 and taking note of their real or likely economic and competitive consequences, if any;
2021/09/10
Committee: ECON
Amendment 532 #
Proposal for a regulation
Article 29 – paragraph 2 a (new)
2a. Very large online platforms shall offer users the choice of recommender systems from first and third party providers. Such third parties must be offered access to the same operating system, hardware or software features that are available or used in the provision by the very large online platform of its own recommender systems.
2021/09/10
Committee: ECON
Amendment 533 #
Proposal for a regulation
Article 29 – paragraph 2 b (new)
2b. Very large online platforms may only limit access to third party recommender systems temporarily in cases of demonstrable abuse by the third party provider or when justified by an immediate requirement to address technical problems such as a serious security vulnerability.
2021/09/10
Committee: ECON
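Amendments 532 and 533 together describe a user-facing choice of recommender systems, with third-party providers given access to the same features as the platform's own systems and access revocable only temporarily for demonstrable abuse or urgent technical problems. The Python sketch below shows one hypothetical shape such an interface could take; it is illustrative only, and none of the names come from the Regulation.

# Illustrative sketch only: one possible shape for the third-party
# recommender choice described in the amendments to Article 29;
# all class and method names are hypothetical.
from abc import ABC, abstractmethod

class RecommenderSystem(ABC):
    """Common interface a platform could expose so that first- and
    third-party recommenders run against the same features (cf. para. 2a)."""
    @abstractmethod
    def rank(self, user_id: str, candidate_ids: list[str]) -> list[str]: ...

class ChronologicalRecommender(RecommenderSystem):
    """A trivial first-party example; reverse-sorting stands in for a
    newest-first ordering of content identifiers."""
    def rank(self, user_id: str, candidate_ids: list[str]) -> list[str]:
        return sorted(candidate_ids, reverse=True)

class RecommenderRegistry:
    """Lets the user pick a provider; access may be suspended only
    temporarily, for demonstrable abuse or an immediate security or
    technical need (cf. para. 2b)."""
    def __init__(self) -> None:
        self._providers: dict[str, RecommenderSystem] = {}
        self._suspended: set[str] = set()

    def register(self, name: str, provider: RecommenderSystem) -> None:
        self._providers[name] = provider

    def suspend_temporarily(self, name: str) -> None:
        self._suspended.add(name)

    def choose(self, name: str) -> RecommenderSystem:
        if name in self._suspended:
            raise PermissionError(f"{name} is temporarily suspended")
        return self._providers[name]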
Amendment 537 #
Proposal for a regulation
Article 30 – paragraph 2 – point b
(b) the natural or legal person on whose behalf the advertisement is displayed and who directly or indirectly financed the advertisement;
2021/09/10
Committee: ECON
Amendment 545 #
Proposal for a regulation
Article 33 – paragraph 1
1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every three months.
2021/09/10
Committee: ECON
Amendment 546 #
Proposal for a regulation
Article 33 – paragraph 2 – point b a (new)
(ba) the impact any declared illegal content has on the market of single Member States and/or the EU as relevant;
2021/09/10
Committee: ECON
Amendment 553 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives especially in relation to the flow of illegal content, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
2021/09/10
Committee: ECON
Amendment 556 #
Proposal for a regulation
Article 37 – paragraph 1
1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security, the economy, or public health.
2021/09/10
Committee: ECON
Amendment 557 #
Proposal for a regulation
Article 37 – paragraph 2 – introductory part
2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, especially those exercising a dominant position, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
2021/09/10
Committee: ECON
Amendment 580 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure, following the consent of the Commission. (deleted)
2021/09/10
Committee: ECON
Amendment 586 #
Proposal for a regulation
Article 50 – paragraph 1 – subparagraph 1
The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where there are reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period.
2021/09/10
Committee: ECON
Amendment 652 #
Proposal for a regulation
Article 73 – paragraph 1
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee.
2021/09/10
Committee: ECON