
Activities of Jorge BUXADÉ VILLALBA related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (14)

Amendment 220 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should adopt measures to limit the dissemination of illegal content. They may also include information measures, such as identifying the origin and means of financing of information sources. Very large online platforms may cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 273 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected, especially the right to freedom of expression and opinion.
2021/06/10
Committee: LIBE
Amendment 294 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) 'content moderation' means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions in the Member State in which the service is rendered, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
2021/06/10
Committee: LIBE
Amendment 297 #
Proposal for a regulation
Article 3 – paragraph 1 – point c
(c) does not select or modify the information contained in the transmission, which includes moderation of non-illegal content in the Member State in which the service is rendered.
2021/06/10
Committee: LIBE
Amendment 303 #
Proposal for a regulation
Article 4 – paragraph 1 – point a
(a) the provider does not modify the information, which includes moderation of non-illegal content in the Member State in which the service is rendered;
2021/06/10
Committee: LIBE
Amendment 334 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
2021/06/10
Committee: LIBE
Amendment 362 #
Proposal for a regulation
Article 9 – paragraph 1
1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and the effect given to the order.
2021/06/10
Committee: LIBE
Amendment 389 #
Proposal for a regulation
Article 10 – paragraph 1
1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation.
2021/06/10
Committee: LIBE
Amendment 391 #
Proposal for a regulation
Article 11 – paragraph 2
2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources to cooperate with the Member States’ authorities, the Commission and the Board and comply with those decisions.
2021/06/10
Committee: LIBE
Amendment 439 #
Proposal for a regulation
Article 14 – paragraph 2 – point a
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content, which should include the specific legal provision infringed in the Member State in which the service is rendered;
2021/06/10
Committee: LIBE
Amendment 502 #
Proposal for a regulation
Article 17 – paragraph 3
3. Online platforms shall handle, within a maximum of 24 hours, complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay.
2021/06/10
Committee: LIBE
Amendment 506 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. If within 48 hours of the complaint lodged by the service recipient the complaint has not been resolved, it shall be deemed to have been accepted.
2021/06/10
Committee: LIBE
Amendment 661 #
Proposal for a regulation
Article 27 – paragraph 1 a (new)
1a. During election periods, such platforms may not, under any circumstances, moderate non-illegal content in the Member State in which the service is rendered.
2021/06/10
Committee: LIBE
Amendment 788 #
Proposal for a regulation
Article 37 – paragraph 4 – point e
(e) safeguards to prevent any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;
2021/06/10
Committee: LIBE