
22 Amendments of Peter LUNDGREN related to 2020/0361(COD)

Amendment 10 #
Proposal for a regulation
Recital 2
(2) Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice.
deleted
2021/06/01
Committee: TRAN
Amendment 26 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/06/01
Committee: TRAN
Amendment 31 #
Proposal for a regulation
Recital 44
(44) Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court and in-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
2021/06/01
Committee: TRAN
Amendment 32 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests, that they work in a diligent and objective manner and that they have a long history of non-partisan behaviour. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions.
The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.43
_________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
2021/06/01
Committee: TRAN
Amendment 34 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal respectively that the notices or complaints are unfounded. Users and material should never be deleted in an automatic way due to notices and complaints. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/06/01
Committee: TRAN
Amendment 41 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, especially the basic right to an account for all legal users, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result.
2021/06/01
Committee: TRAN
Amendment 54 #
Proposal for a regulation
Recital 68
(68) It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation.
deleted
2021/06/01
Committee: TRAN
Amendment 58 #
Proposal for a regulation
Recital 71
(71) In case of extraordinary circumstances affecting public security or public health, service providers may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged in drawing up and applying specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content.
2021/06/01
Committee: TRAN
Amendment 61 #
Proposal for a regulation
Recital 88
(88) In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State.
deleted
2021/06/01
Committee: TRAN
Amendment 62 #
Proposal for a regulation
Recital 89
(89) The Board should contribute to achieving a common Union perspective on the consistent application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis-à-vis very large online platforms. The Board should also contribute to the drafting of relevant templates and codes of conduct and analyse emerging general trends in the development of digital services in the Union.
deleted
2021/06/01
Committee: TRAN
Amendment 63 #
Proposal for a regulation
Recital 90
(90) For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While not legally binding, the decision to deviate therefrom should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation.
deleted
2021/06/01
Committee: TRAN
Amendment 64 #
Proposal for a regulation
Recital 91
(91) The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards custom duties, or consumer protection, as necessary for the performance of its tasks.
deleted
2021/06/01
Committee: TRAN
Amendment 65 #
Proposal for a regulation
Recital 92
(92) The Commission, through the Chair, should participate in the Board without voting rights. Through the Chair, the Commission should ensure that the agenda of the meetings is set in accordance with the requests of the members of the Board as laid down in the rules of procedure and in compliance with the duties of the Board laid down in this Regulation.
deleted
2021/06/01
Committee: TRAN
Amendment 66 #
Proposal for a regulation
Recital 93
(93) In view of the need to ensure support for the Board’s activities, the Board should be able to rely on the expertise and human resources of the Commission and of the competent national authorities. The specific operational arrangements for the internal functioning of the Board should be further specified in the rules of procedure of the Board.
deleted
2021/06/01
Committee: TRAN
Amendment 67 #
Proposal for a regulation
Recital 95
(95) In order to address those public policy concerns it is therefore necessary to provide for a common system of enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the Digital Services Coordinator of establishment, upon its own initiative or upon the Board’s advice, should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. That Digital Services Coordinator should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Digital Services Coordinator should perform its role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board.
deleted
2021/06/01
Committee: TRAN
Amendment 75 #
Proposal for a regulation
Article 1 – paragraph 5 – point e
(e) Regulation (EU) …./…. on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./…. laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted]
deleted
2021/06/01
Committee: TRAN
Amendment 203 #
Proposal for a regulation
Article 25 a (new)
Article 25 a
The right to an account
1. Very large platforms shall be obliged to provide every user with legal intentions the ability to create an account. The user shall be able to verify his or her account.
2. Very large platforms shall only be able to delete an account if the user uses the account for illegal activities such as abuse, fraud or terrorist propaganda.
2021/06/01
Committee: TRAN
Amendment 204 #
Proposal for a regulation
Article 25 b (new)
Article 25 b
Content moderation
1. Only illegal content may be deleted on a very large platform by the provider.
2. Users may block and delete content in the section of the platform that they control, for example comments on their own posts.
2021/06/01
Committee: TRAN
Amendment 205 #
Proposal for a regulation
Article 25 c (new)
Article 25 c
Shadow banning
The practice of shadow banning, whereby a user can still use and post on the very large platform but the reach of his or her posts is significantly reduced, shall be prohibited.
2021/06/01
Committee: TRAN
Amendment 206 #
Proposal for a regulation
Article 25 d (new)
Article 25 d
Algorithms and political views
The algorithm of a very large platform that selects the content to be shown shall never disadvantage one side of the political debate.
2021/06/01
Committee: TRAN
Amendment 214 #
Proposal for a regulation
Article 35
Article 35
Codes of conduct
1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data.
2. Where significant systemic risk within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Service Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain.
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions.
5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.
deleted
2021/06/01
Committee: TRAN
Amendment 222 #
Proposal for a regulation
Article 37
[...]
deleted
2021/06/01
Committee: TRAN