Activities of Rainer WIELAND related to 2020/0361(COD)
Plenary speeches (2)
Digital Services Act (continuation of debate)
Digital Services Act (continuation of debate)
Amendments (80)
Amendment 104 #
Proposal for a regulation
Recital 8
(8) Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States should be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council27. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union. _________________ 27 Regulation (EU) No 1215/2012 of the European Parliament and of the Council of 12 December 2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters (OJ L 351, 20.12.2012, p. 1).
Amendment 115 #
Proposal for a regulation
Recital 11
(11) It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected and are lex specialis, prevailing over this Regulation.
Amendment 123 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should be defined broadly and also covers information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant, dangerous or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
Amendment 131 #
Proposal for a regulation
Recital 13
(13) Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of the principal service and that feature cannot, for objective technical reasons, be used without that principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher.
Amendment 141 #
Proposal for a regulation
Recital 18
(18) The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. A provider of intermediary services plays an active role when assistance is given to the recipient of the service, notably for the optimizing and the promotion of the content offered. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider.
Amendment 144 #
Proposal for a regulation
Recital 20
(20) The provider should not be able to benefit from the exemptions from liability provided for in this Regulation where the main purpose is to engage in or facilitate illegal activities, or where a provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities and does not provide its service neutrally.
Amendment 153 #
Proposal for a regulation
Recital 23
(23) In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders on the platforms, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
Amendment 181 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information, which should include the relevant email addresses, telephone numbers, IP addresses and other contact details necessary to ensure such compliance, should be available in respect of all types of orders. Orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
Amendment 196 #
Proposal for a regulation
Recital 40
Amendment 204 #
Proposal for a regulation
Recital 41
Amendment 213 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available recourses to challenge the decision of the hosting service provider should always include judicial redress.
Amendment 218 #
Proposal for a regulation
Recital 43 a (new)
(43 a) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide, based on its own assessment, whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation.
Amendment 219 #
Proposal for a regulation
Recital 43 b (new)
(43 b) The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
Amendment 245 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. Online advertising is a significant source of financing for many digital business models and an effective tool to reach new customers, not least for small and medium-sized companies. However, there are some instances when online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. Based on the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, it is without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
Amendment 265 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary and proportionate means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
Amendment 277 #
Proposal for a regulation
Recital 62
(62) A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient.
Amendment 279 #
Proposal for a regulation
Recital 63
(63) Advertising systems used by very large online platforms could pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civil discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned.
Amendment 282 #
Proposal for a regulation
Recital 64
(64) In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for providing information or compelling access to data from very large online platforms to vetted researchers where relevant to a research project. All requests for providing information or access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service.
Amendment 301 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan.
Amendment 305 #
Proposal for a regulation
Recital 70
(70) The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives.
Amendment 373 #
Proposal for a regulation
Article 2 – paragraph 1 – point d – indent 1
— a significant number of users in relation to their population in one or more Member States; or
Amendment 388 #
Proposal for a regulation
Article 2 – paragraph 1 – point h
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of the principal service and, for objective and technical reasons, cannot be used without that principal service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation.
Amendment 389 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘editorial platform’ means an intermediary service which is in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790 or another editorial media service and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content and which is under the supervision of the editorial team of the publication or other editorial media.
Amendment 397 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(i a) 'live streaming platform services' means an information society service whose main purpose, or one of the main purposes, is to give the public access to live broadcast audio or video material and which it organises and promotes for profit-making purposes;
Amendment 400 #
Proposal for a regulation
Article 2 – paragraph 1 – point o
(o) ‘recommender system’ means a fully or partially automated system, used by a very large online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
Amendment 419 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge or awareness, acts within the deadlines set out in Article 5(1a) (new) to remove or disable access to the illegal content.
Amendment 421 #
Proposal for a regulation
Article 5 – paragraph 1 a (new)
Amendment 423 #
Proposal for a regulation
Article 5 – paragraph 2 a (new)
2 a. Paragraph 1 shall not apply when the main purpose of the information society service is to engage in or facilitate illegal activities or when the provider of the information society service deliberately collaborates with a recipient of the services in order to undertake illegal activities.
Amendment 425 #
Proposal for a regulation
Article 5 – paragraph 3
3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders on the platform, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.
Amendment 430 #
Proposal for a regulation
Article 5 a (new)
Article 5 a The exemptions from liability established in Articles 3, 4 and 5 shall not apply where the information society service plays an active role of such a kind as to give it knowledge of, or control over the information provided by the recipient of the service.
Amendment 503 #
Proposal for a regulation
Article 9 – paragraph 2 – point b
(b) the order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control, including email addresses, telephone numbers, IP addresses and other contact details necessary to determine the compliance referred to in (a);
Amendment 525 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. The Member States may require very large online platforms to designate a legal representative in their Member State.
Amendment 528 #
Proposal for a regulation
Article 11 – paragraph 2
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources to guarantee proper and timely cooperation with the Member States’ authorities, the Commission and the Board and to comply with those decisions.
Amendment 531 #
Proposal for a regulation
Article 11 a (new)
Article 11 a
Exclusions
Articles 12 and 13 of Section 1, the provisions of Section 2, and Section 3 of Chapter III shall not apply to:
(a) editorial platforms within the meaning of Article 2(h a) of this Regulation;
(b) online platforms that qualify as micro and medium-sized enterprises within the meaning of the Annex to Recommendation 2003/361/EC;
(c) an intermediary service, except very large online platforms, where it would constitute a disproportionate burden in view of its size, the nature of its activity and the risk posed to users.
Amendment 537 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions, which have to respect European and national law. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
Amendment 545 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Where very large online platforms within the meaning of Article 25 of this Regulation otherwise allow for the dissemination to the public of press publications within the meaning of Article 2(4) of Directive (EU) 2019/790, such platforms shall not remove, disable access to, suspend or otherwise interfere with such content or the related service or suspend or terminate the related account on the basis of the alleged incompatibility of such content with its terms and conditions, unless it is illegal content.
Amendment 564 #
Proposal for a regulation
Article 13 – paragraph 1 – point a
(a) the number of orders received from Member States’ authorities, categorised, where possible, by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders;
Amendment 566 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action;
Amendment 570 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, where identifiable, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed.
Amendment 575 #
Proposal for a regulation
Article 13 – paragraph 2
2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, paragraph 1 shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status.
Amendment 586 #
Proposal for a regulation
Article 14
Amendment 634 #
Proposal for a regulation
Article 15 – paragraph 4
4. Paragraphs 2, 3 and 4 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. In addition, those paragraphs shall not apply to enterprises that previously qualified for the status of a micro or small enterprise within the meaning of the Annex to Recommendation 2003/361/EC during the twelve months following their loss of that status.
Amendment 647 #
Proposal for a regulation
Article 16 a (new)
Amendment 680 #
Proposal for a regulation
Article 18 – paragraph 1 – introductory part
1. After internal complaint-handling mechanisms are exhausted, recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
Amendment 710 #
Proposal for a regulation
Article 19 – paragraph 1
1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14 are immediately processed, without prejudice to the implementation of a complaint and redress mechanism.
Amendment 743 #
Proposal for a regulation
Article 19 – paragraph 7 a (new)
7a. Online platforms shall, where possible, provide trusted flaggers with access to technical means that help them detect illegal content on a large scale.
Amendment 760 #
(d) where identifiable, the intention of the recipient, individual, entity or complainant.
Amendment 781 #
Proposal for a regulation
Article 22 – paragraph 1 – introductory part
1. Where an online platform allows consumers to conclude distance contracts with professional traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information:
Amendment 782 #
Proposal for a regulation
Article 22 – paragraph 1 – point c
Amendment 822 #
Proposal for a regulation
Article 24
Amendment 865 #
Proposal for a regulation
Article 26 – paragraph 1 – point b
(b) any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, freedom and pluralism of the media, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively, caused by an illegal activity;
Amendment 870 #
Proposal for a regulation
Article 26 – paragraph 1 – point c
(c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative and illegal effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
Amendment 873 #
Proposal for a regulation
Article 26 – paragraph 2
2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
Amendment 918 #
Proposal for a regulation
Article 28 – paragraph 1 – point b
(b) any commitment of voluntary measures undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
Amendment 931 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall base the parameters of their recommender systems, as well as any options for the recipients of the service to modify or influence those parameters that they may have made available, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679, on Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (P2B) and set them out in their terms and conditions.
Amendment 935 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters used in recommender systems shall always be fair and non-discriminatory.
Amendment 938 #
Proposal for a regulation
Article 29 – paragraph 2
Amendment 946 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, for advertisements that have been seen by more than 5000 recipients of the service and until six months after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
Amendment 953 #
Proposal for a regulation
Article 30 – paragraph 2 – point d
Amendment 954 #
Proposal for a regulation
Article 30 – paragraph 2 – point e
Amendment 966 #
Proposal for a regulation
Article 31 – paragraph 1
1. Very large online platforms shall, upon their reasoned request and within a reasonable period specified in the request, provide the Digital Services Coordinator of establishment or the Commission with information and access to data that are necessary to properly monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes.
Amendment 969 #
Proposal for a regulation
Article 31 – paragraph 2
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide information and access to relevant data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1).
Amendment 973 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 for a limited time and through online databases or application programming interfaces, as appropriate.
Amendment 1009 #
Proposal for a regulation
Article 34 – paragraph 1 – point f
Amendment 1016 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) in relation to the dissemination of illegal content emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1019 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
Amendment 1024 #
Proposal for a regulation
Article 35 – paragraph 3
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives in relation to the dissemination of illegal content, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of the relevant stakeholders, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
Amendment 1032 #
Proposal for a regulation
Article 36 – paragraph 1
1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service, and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30 and Article 6 of Directive 2000/31/EC.
Amendment 1034 #
Proposal for a regulation
Article 36 – paragraph 2 – introductory part
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30.
Amendment 1035 #
Proposal for a regulation
Article 36 – paragraph 2 – point a
Amendment 1036 #
Proposal for a regulation
Article 36 – paragraph 2 – point b
Amendment 1044 #
Proposal for a regulation
Article 37
Amendment 1058 #
Proposal for a regulation
Article 41 – paragraph 2 – point e
(e) the power to adopt proportionate interim measures to avoid the risk of serious harm.
Amendment 1060 #
1. Member States shall lay down the rules on penalties, including administrative fines, applicable to infringements of this Regulation by providers of intermediary services under their jurisdiction and shall take all the necessary measures to ensure that they are properly and effectively implemented in accordance with Article 41.
Amendment 1061 #
Proposal for a regulation
Article 42 – paragraph 2
2. Penalties shall be effective, proportionate and dissuasive. They shall take particular account of the interests of small-scale providers and start-ups and of their economic viability. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them.
Amendment 1103 #
Proposal for a regulation
Article 47 – paragraph 2 – point a a (new)
(aa) contributing to the effective application of Article 3 of Directive 2000/31/EC, in order to prevent fragmentation of the digital single market, and of the obligations of very large platforms under Article 5 of the Platform-to-Business Regulation (EU) 2019/1150;
Amendment 1132 #
Proposal for a regulation
Article 55 – paragraph 1
1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement.
Amendment 1135 #
Proposal for a regulation
Article 57 – paragraph 1
1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation of and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide explanations relating to, and, where necessary, access to, its databases and algorithms.
Amendment 1138 #
Proposal for a regulation
Article 59 – paragraph 2 – introductory part
2. The Commission may, by decision and in compliance with the proportionality principle, impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently:
Amendment 1151 #
Proposal for a regulation
Article 74 – paragraph 2 – introductory part
2. It shall apply from [date - six months after its entry into force].