
Activities of Joachim Stanisław BRUDZIŃSKI related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (42)

Amendment 140 #
Proposal for a regulation
Recital 12
(12) In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of “illegal content” should underpin the general idea that what is illegal offline should also be illegal online. The concept should be defined broadly to cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
2021/06/10
Committee: LIBE
Amendment 155 #
Proposal for a regulation
Recital 20
(20) A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities, or the main purpose of which is to engage in or facilitate such activities, should therefore not be able to benefit from the exemptions from liability provided for in this Regulation.
2021/06/10
Committee: LIBE
Amendment 207 #
Proposal for a regulation
Recital 52
(52) Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The parameters shall include, if applicable, the optimisation goal selected by the advertiser, information on the use of custom lists and in such case – the category and source of personal data uploaded to the online platform and the legal basis for uploading this personal data pursuant to Regulation (EU) 2016/679, information on the use of lookalike audiences and in such case – relevant information on the seed audience and an explanation why the recipient of the advertisement has been determined to be part of the lookalike audience, meaningful information about the online platform’s algorithms or other tools used to optimise the delivery of the advertisement, including a specification of the optimisation goal and a meaningful explanation of reasons why the online platform has decided that the optimisation goal can be achieved by displaying the advertisement to this recipient. 
The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
2021/06/10
Committee: LIBE
Amendment 224 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms should reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They should also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/06/10
Committee: LIBE
Amendment 255 #
Proposal for a regulation
Recital 76
(76) In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity under which Member State's jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures they have taken in the exercise of that jurisdiction. In addition, in order to ensure effective protection of the fundamental rights of EU citizens that takes into account diverse national laws and differences in socio-cultural context between countries, a Member State shall exercise jurisdiction where it concerns very large online platforms which offer services to a significant number of recipients in a given Member State. Member States' jurisdiction is particularly important in the case of very large online platforms which are social media, because they play a central role in facilitating the public debate.
2021/06/10
Committee: LIBE
Amendment 262 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, competition, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks.
2021/06/10
Committee: LIBE
Amendment 330 #
Proposal for a regulation
Article 8 – paragraph 1
1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item or multiple items of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the orders, without undue delay, specifying the action taken and the moment when the action was taken.
2021/06/10
Committee: LIBE
Amendment 356 #
Proposal for a regulation
Article 8 – paragraph 3 a (new)
3 a. The Digital Services Coordinator of each Member State, on its own initiative, within 72 hours of receiving the copy of the order to act, has the right to scrutinise the order to determine whether it seriously or manifestly infringes the respective Member State’s law and revoke the order on its own territory.
2021/06/10
Committee: LIBE
Amendment 390 #
Proposal for a regulation
Article 10 – paragraph 2
2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact, including postal address, and ensure that that information is up to date. Providers of intermediary services shall notify that information, including the name, postal address, the electronic mail address and telephone number, of their single point of contact, to the Digital Services Coordinator in the Member State where they are established.
2021/06/10
Committee: LIBE
Amendment 392 #
Proposal for a regulation
Article 11 – paragraph 4
4. Providers of intermediary services shall notify valid identification data, including the name, postal address, the electronic mail address and telephone number of their legal representative, to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date.
2021/06/10
Committee: LIBE
Amendment 394 #
Proposal for a regulation
Article 11 – paragraph 5 a (new)
5 a. A very large online platform as defined in Article 25 shall, at the request of the Digital Services Coordinator of the Member State where this provider offers its services, designate a legal representative bound by the obligations laid down in this Article.
2021/06/10
Committee: LIBE
Amendment 398 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear, plain, intelligible and unambiguous language and shall be publicly available in an easily accessible format.
2021/06/10
Committee: LIBE
Amendment 407 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2 a. Very large online platforms, as defined in Article 25, should publish their terms and conditions in all official languages of the Union.
2021/06/10
Committee: LIBE
Amendment 408 #
Proposal for a regulation
Article 12 – paragraph 2 b (new)
2 b. The Digital Services Coordinator of each Member State has the right to request very large online platforms to apply measures and tools of content moderation, including algorithmic decision-making and human review, reflecting the Member State’s socio-cultural context. The framework for this cooperation, as well as specific measures thereof, may be laid down in national legislation and be notified to the European Commission.
2021/06/10
Committee: LIBE
Amendment 410 #
Proposal for a regulation
Article 12 – paragraph 2 c (new)
2 c. Notwithstanding the right in Article 12(3), the Digital Services Coordinator of each Member State may, by means of national legislation, seek to request that a very large online platform cooperate with the Digital Services Coordinator of the Member State in question in handling specific legal content removal cases in which there is reason to believe that the Member State’s socio-cultural context may have played a vital role.
2021/06/10
Committee: LIBE
Amendment 420 #
Proposal for a regulation
Article 13 – paragraph 1 – point b
(b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action;
2021/06/10
Committee: LIBE
Amendment 423 #
Proposal for a regulation
Article 13 – paragraph 1 – point d
(d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed.
2021/06/10
Committee: LIBE
Amendment 428 #
Proposal for a regulation
Article 13 – paragraph 2 a (new)
2 a. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down specific templates for the reports specified in paragraph 1.
2021/06/10
Committee: LIBE
Amendment 440 #
Proposal for a regulation
Article 14 – paragraph 2 – point b
(b) a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary and applicable, additional information enabling the identification of the illegal content, which shall be appropriate to the type of content and to the specific type of intermediary;
2021/06/10
Committee: LIBE
Amendment 486 #
Proposal for a regulation
Article 16 – paragraph 1
This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which do not engage in illegal activity.
2021/06/10
Committee: LIBE
Amendment 495 #
Proposal for a regulation
Article 17 – paragraph 1 – point c a (new)
(c a) any other decisions that affect the availability, visibility or accessibility of that content and the recipient’s account or the recipient’s access to significant features of the platform’s regular services;
2021/06/10
Committee: LIBE
Amendment 505 #
Proposal for a regulation
Article 17 – paragraph 4
4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. This feedback shall also include:
- information on whether the decision referred to in paragraph 1 was taken as a result of human review or through automated means;
- in case the decision referred to in paragraph 1 is to be sustained, a detailed explanation of how the information to which the complaint relates is in breach of the platform’s terms and conditions or why the online platform finds the information unlawful.
2021/06/10
Committee: LIBE
Amendment 508 #
Proposal for a regulation
Article 17 – paragraph 5
5. Online platforms shall ensure that the decisions referred to in paragraph 4 are not taken solely on the basis of automated means. Complainants shall have the right to request human review and consultation with relevant online platform staff with respect to the content to which the complaint relates.
2021/06/10
Committee: LIBE
Amendment 511 #
Proposal for a regulation
Article 17 – paragraph 5 a (new)
5 a. Recipients of the service negatively affected by the decision of an online platform shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned. The procedure shall ensure that an independent judicial body decides on the matter without undue delay, resolving the case no later than within 14 days, while granting the negatively affected party the right to seek interim measures to be imposed within 48 hours of the recourse being brought before this body. The right to seek judicial redress and interim measures shall not be limited or conditioned on exhausting the internal complaint-handling system.
2021/06/10
Committee: LIBE
Amendment 710 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems or any other systems used to determine the order of presentation of content, or which decrease the visibility of content, shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in these systems.
2021/06/10
Committee: LIBE
Amendment 715 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1 a. The main parameters referred to in paragraph 1 shall include, at minimum:
(a) the main criteria used by the relevant recommender system;
(b) how these criteria are weighted against each other;
(c) the optimisation goal of the relevant recommender system;
(d) an explanation of the role that the behaviour of the recipients of the service plays in how the relevant recommender system functions.
2021/06/10
Committee: LIBE
Amendment 717 #
Proposal for a regulation
Article 29 – paragraph 1 b (new)
1 b. Very large online platforms shall provide options for the recipients of the service to modify or influence the parameters referred to in paragraph 1 a, including at least one option which is not based on profiling within the meaning of Article 4(4) of Regulation (EU) 2016/679.
2021/06/10
Committee: LIBE
Amendment 720 #
Proposal for a regulation
Article 29 – paragraph 2
2. Very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service:
(a) to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them;
(b) to select third party recommender systems.
2021/06/10
Committee: LIBE
Amendment 775 #
Proposal for a regulation
Article 35 – paragraph 1
1. The Commission and the Board shall have the right to request and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2021/06/10
Committee: LIBE
Amendment 777 #
Proposal for a regulation
Article 35 – paragraph 2
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission shall request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
2021/06/10
Committee: LIBE
Amendment 796 #
Proposal for a regulation
Article 40 – paragraph 3 a (new)
3 a. Member States shall exercise jurisdiction for the purposes of Chapters III and IV of this Regulation where it concerns very large online platforms, as defined in Article 25, which offer services to a significant number of active recipients of the service in a given Member State, which can be calculated on the basis of Article 23(2).
2021/06/10
Committee: LIBE
Amendment 805 #
Proposal for a regulation
Article 43 – paragraph 1
Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. The assessment of the complaint may be supplemented by the opinion of the Digital Services Coordinator of the Member State where the recipient resides or is established on how the matter should be resolved, taking into account national law and the socio-cultural context of the given Member State. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority.
2021/06/10
Committee: LIBE
Amendment 809 #
Proposal for a regulation
Article 43 – paragraph 1 a (new)
Pursuant to paragraph 1, the Digital Services Coordinator of establishment should, in cases concerning a complaint transmitted by the Digital Services Coordinator of the Member State where the recipient resides or is established, assess the matter in a timely manner and inform the Digital Services Coordinator of the Member State where the recipient resides or is established of how the complaint has been handled.
2021/06/10
Committee: LIBE
Amendment 813 #
Proposal for a regulation
Article 45 – paragraph 1 a (new)
1 a. A request or recommendation pursuant to paragraph 1 should not preclude the Digital Services Coordinator of the Member State where the recipient of the service resides or is established from carrying out its own investigation concerning a suspected infringement of this Regulation by a provider of an intermediary service.
2021/06/10
Committee: LIBE
Amendment 814 #
Proposal for a regulation
Article 45 – paragraph 2 a (new)
2 a. A recommendation pursuant to paragraphs 1 and 2 may additionally indicate:
(a) an opinion on matters that involve taking into account national law and socio-cultural context;
(b) a draft decision based on an investigation pursuant to paragraph 1 a.
2021/06/10
Committee: LIBE
Amendment 818 #
Proposal for a regulation
Article 45 – paragraph 7
7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information should also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
2021/06/10
Committee: LIBE
Amendment 834 #
Proposal for a regulation
Article 48 – paragraph 6
6. The Board shall adopt its rules of procedure and inform the Commission thereof.
2021/06/10
Committee: LIBE
Amendment 836 #
Proposal for a regulation
Article 49 – paragraph 1 – point d
(d) advise the Commission to take the measures referred to in Article 51 and adopt opinions on issues concerning very large online platforms in accordance with this Regulation;
2021/06/10
Committee: LIBE
Amendment 837 #
Proposal for a regulation
Article 49 – paragraph 1 – point e a (new)
(e a) issue opinions, recommendations or advice on matters related to Article 34.
2021/06/10
Committee: LIBE
Amendment 856 #
Proposal for a regulation
Article 52 – paragraph 1
1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, their legal representatives, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period.
2021/06/10
Committee: LIBE
Amendment 908 #
Proposal for a regulation
Article 73 – paragraph 1
1. By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation.
2021/06/10
Committee: LIBE
Amendment 909 #
Proposal for a regulation
Article 73 – paragraph 4
4. By three years from the date of application of this Regulation at the latest, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board.
deleted
2021/06/10
Committee: LIBE