
Activities of József SZÁJER related to 2020/2019(INL)

Shadow reports (1)

REPORT with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online
2020/10/05
Committee: JURI
Dossiers: 2020/2019(INL)
Documents: PDF(298 KB) DOC(122 KB)
Authors: Tiemo WÖLKEN (MEP ID 185619)

Amendments (104)

Amendment 4 #
Motion for a resolution
Citation 7 a (new)
- having regard to the Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 25 May 2016 on Online Platforms and the Digital Single Market - Opportunities and Challenges for Europe (COM(2016)288),
2020/06/05
Committee: JURI
Amendment 5 #
Motion for a resolution
Citation 7 b (new)
- having regard to the Recommendation of the Commission of 1 March 2018 on measures to effectively tackle illegal content online (C(2018) 1177),
2020/06/05
Committee: JURI
Amendment 6 #
Motion for a resolution
Citation 7 c (new)
- having regard to the Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market,
2020/06/05
Committee: JURI
Amendment 7 #
Motion for a resolution
Citation 7 d (new)
- having regard to the Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services,
2020/06/05
Committee: JURI
Amendment 8 #
Motion for a resolution
Citation 7 e (new)
- having regard to the Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography,
2020/06/05
Committee: JURI
Amendment 9 #
Motion for a resolution
Citation 7 f (new)
- having regard to the Directive (EU) 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism,
2020/06/05
Committee: JURI
Amendment 12 #
Motion for a resolution
Recital A
A. whereas digital services, being a cornerstone of the Union’s economy and the livelihood of a large number of its citizens, need to be regulated in a way that balances central concerns like respect for fundamental rights and other rights of citizens, with the need to support development and economic progress, taking into account the interests of users and all market participants, with particular regard to small businesses, SMEs and start-ups;
2020/06/05
Committee: JURI
Amendment 22 #
Motion for a resolution
Recital B a (new)
Ba. whereas digital services are used by the majority of Europeans on a daily basis, but are subject to an increasingly wide set of rules across the EU, leading to significant market fragmentation and, consequently, legal uncertainty for European users and services operating cross-border, combined with a lack of regulatory control over key aspects of today's information environment;
2020/06/05
Committee: JURI
Amendment 48 #
Motion for a resolution
Recital F
F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere; whereas public spaces must be managed in a manner that respects fundamental rights and the rights of the users;
2020/06/05
Committee: JURI
Amendment 49 #
Motion for a resolution
Recital G
G. whereas upholding the law in the digital world does not only involve effective enforcement of rights, but also, in particular, ensuring access to justice for all; whereas delegation of the taking of decisions regarding the legality of content or of law enforcement powers to private companies can undermine the right to a fair trial and risks failing to provide an effective remedy; whereas decisions taken by digital service providers should be complemented by a fast-track legal procedure with adequate guarantees;
2020/06/05
Committee: JURI
Amendment 54 #
Motion for a resolution
Recital H
H. whereas content hosting platforms often employ automated content removal mechanisms that raise legitimate rule of law concerns, in particular when they are encouraged to employ such mechanisms pro-actively and voluntarily, resulting in content removal taking place without a clear legal basis, which is in contravention of Article 10 of the European Convention on Human Rights, stating that formalities, conditions, restrictions or penalties governing the exercise of freedom of expression and information must be prescribed by law;
deleted
2020/06/05
Committee: JURI
Amendment 60 #
Motion for a resolution
Recital H a (new)
Ha. whereas automated content removal mechanisms of digital service providers should be proportionate, covering only those justified cases where the benefits of removing content outweigh the potential disadvantages of keeping it online; whereas these procedures should also be transparent and their terms and conditions should be made known before users use the service;
2020/06/05
Committee: JURI
Amendment 81 #
Motion for a resolution
Paragraph 1
1. Requests that the Commission submit without undue delay a set of legislative proposals comprising a Digital Services Act with a wide material, personal and territorial scope, including the recommendations as set out in the Annex to this resolution; considers that, without prejudice to detailed aspects of the future legislative proposals, Article 114 of the Treaty on the Functioning of the European Union should be chosen as the legal basis;
2020/06/05
Committee: JURI
Amendment 86 #
Motion for a resolution
Paragraph 2
2. Proposes that the Digital Services Act provide digital service providers with a clear and up-to-date, innovation-friendly regulatory framework, protect users when accessing digital services, guarantee accessible and independent recourse to judicial redress and ensure the necessary cooperation among Member States;
2020/06/05
Committee: JURI
Amendment 93 #
Motion for a resolution
Paragraph 2 a (new)
2a. Proposes that the Digital Services Act follow a sector and problem-specific approach and make a clear distinction between illegal and harmful content when elaborating the appropriate policy options;
2020/06/05
Committee: JURI
Amendment 96 #
Motion for a resolution
Paragraph 2 b (new)
2b. Underlines that any new framework established in the Digital Services Act should be manageable for small businesses, SMEs and start-ups and should therefore include proportionate obligations and clear safeguards for all sectors;
2020/06/05
Committee: JURI
Amendment 97 #
Motion for a resolution
Paragraph 2 c (new)
2c. Proposes that the Digital Services Act introduce enhanced transparency rules for social media platforms in order to disclose the funding and the influence of interest groups behind those using digital services and to show who is legally responsible for the content;
2020/06/05
Committee: JURI
Amendment 98 #
Motion for a resolution
Paragraph 2 d (new)
2d. Proposes that the Digital Services Act set the obligation for digital service providers without a permanent establishment in the EU to designate a legal representative in the interest of users within the European Union and to make the contact information of this representative visible and accessible on their websites;
2020/06/05
Committee: JURI
Amendment 99 #
Motion for a resolution
Paragraph 2 e (new)
2e. Underlines that online platforms hosting or moderating content online should bear more responsibility for the content they host and should act proactively to prevent illegality;
2020/06/05
Committee: JURI
Amendment 105 #
Motion for a resolution
Paragraph 3
3. Considers that, following the actions of digital service providers, any final decision on the legality of user-generated content must be made by an independent judiciary and not a private commercial entity;
2020/06/05
Committee: JURI
Amendment 110 #
Motion for a resolution
Paragraph 4
4. Insists that the regulation must proscribe content moderation practices that are not proportionate or that unduly go beyond the purpose of protection under the law;
2020/06/05
Committee: JURI
Amendment 118 #
Motion for a resolution
Paragraph 5
5. Recommends the establishment of a network of national authorities tasked with monitoring the practice of automated content filtering and curation, and with reporting to the EU institutions;
2020/06/05
Committee: JURI
Amendment 129 #
Motion for a resolution
Paragraph 6
6. Suggests that digital service providers regularly submit transparency reports to the network of national authorities and the European Commission concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms publish statistics and data related to automated content filtering, and their decisions on removing user-generated content, in a publicly accessible database;
2020/06/05
Committee: JURI
Amendment 138 #
Motion for a resolution
Paragraph 7
7. Considers the establishment of independent dispute settlement bodies in the Member States, tasked with settling disputes regarding content moderation;
2020/06/05
Committee: JURI
Amendment 149 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing digital service providers to employ automated filtering mechanisms that go beyond the level of protection required by law; encourages, however, digital service providers to employ such mechanisms in order to combat illegal content online;
2020/06/05
Committee: JURI
Amendment 153 #
Motion for a resolution
Paragraph 9
9. Considers that the user-targeted amplification of content based on the views or positions presented in such content is a practice on which further monitoring might be required; the Commission should therefore pay attention to, and analyse the impact of, cases where the visibility of such content is increased on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements;
2020/06/05
Committee: JURI
Amendment 157 #
Motion for a resolution
Paragraph 10
10. Is of the view that the use of targeted advertising must be regulated more strictly in favour of less intrusive forms of advertising that do not require extensive tracking of user interaction with content;
deleted
2020/06/05
Committee: JURI
Amendment 161 #
Motion for a resolution
Paragraph 10 a (new)
10a. Notes, however, that targeted advertising is currently governed by the General Data Protection Regulation, which has to be properly enforced in the Union before any new legislation in this field is considered;
2020/06/05
Committee: JURI
Amendment 163 #
Motion for a resolution
Paragraph 11
11. Recommends, therefore, that the Digital Services Act introduce rules to enhance transparency related to targeted advertising, especially when data are tracked on third-party websites;
2020/06/05
Committee: JURI
Amendment 170 #
Motion for a resolution
Paragraph 12
12. Calls on the Commission to assess the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power; suggests, to this end, to explore options to facilitate the interoperability and portability of data;
deleted
2020/06/05
Committee: JURI
Amendment 192 #
Motion for a resolution
Paragraph 15 a (new)
15a. Suggests creating a common understanding of what constitutes false or misleading advertising;
2020/06/05
Committee: JURI
Amendment 196 #
Motion for a resolution
Paragraph 16
16. Calls for a streamlined exchange of necessary information between digital service providers and public authorities;
2020/06/05
Committee: JURI
Amendment 207 #
Motion for a resolution
Paragraph 18
18. Considers that necessary steps should be taken to ensure equality between the parties in the case of smart contracts, for which the Commission should examine the modalities;
2020/06/05
Committee: JURI
Amendment 212 #
Motion for a resolution
Subheading 5
Provisions regarding private international law
deleted
2020/06/05
Committee: JURI
Amendment 213 #
Motion for a resolution
Paragraph 19
19. Considers that non-negotiable terms and conditions should neither prevent effective access to justice in Union courts nor disenfranchise Union citizens or businesses and that the status of access rights to data under private international law is uncertain and leads to disadvantages for Union citizens and businesses;
deleted
2020/06/05
Committee: JURI
Amendment 216 #
Motion for a resolution
Paragraph 20
20. Emphasises the importance of ensuring that the use of digital services in the Union is fully governed by Union law under the jurisdiction of Union courts;
deleted
2020/06/05
Committee: JURI
Amendment 219 #
Motion for a resolution
Paragraph 21
21. Concludes further that legislative solutions to these issues ought to be found at Union level if action at the international level does not seem feasible, or if there is a risk of such action taking too long to come to fruition;
deleted
2020/06/05
Committee: JURI
Amendment 234 #
Motion for a resolution
Annex I – part A – introductory part – indent 6
- The proposal addresses the importance of fair implementation of the rights of users as regards interoperability and portability.
deleted
2020/06/05
Committee: JURI
Amendment 235 #
Motion for a resolution
Annex I – part A – introductory part – indent 7
- The proposal raises the need for assessment in the field of distributed ledger technologies, including block chains and, in particular, smart contracts.
2020/06/05
Committee: JURI
Amendment 237 #
Motion for a resolution
Annex I – part A – introductory part – indent 8
- The proposal raises the importance of bringing clarity on the non-negotiable terms and conditions used by online platforms and of ensuring that access to justice is appropriately guaranteed.
2020/06/05
Committee: JURI
Amendment 238 #
Motion for a resolution
Annex I – part A – part I – introductory part
The Digital Services Act should reflect, among others, the following elements of the proposals, on the basis of a proper public consultation and impact analysis:
2020/06/05
Committee: JURI
Amendment 239 #
Motion for a resolution
Annex I – part A – part I – section 1 –introductory part
A regulation ‘on contractual rights as regards content management’ that contains the following elements:
2020/06/05
Committee: JURI
Amendment 241 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 a (new)
- It should build upon the home state control principle, by updating its scope in light of the increasing convergence of user protection.
2020/06/05
Committee: JURI
Amendment 242 #
Motion for a resolution
Annex I – part A – part I – section 1 –– indent 1 b (new)
- It should make a clear distinction between illegal and harmful content when it comes to applying the appropriate policy options.
2020/06/05
Committee: JURI
Amendment 243 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 1 c (new)
- It should avoid extending its scope in a way that would conflict with existing sectoral rules already in force, such as the Copyright Directive or other existing European law in the media and audio-visual field.
2020/06/05
Committee: JURI
Amendment 244 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 2
- It should provide principles for content moderation, including as regards discriminatory content moderation practices.
2020/06/05
Committee: JURI
Amendment 250 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 3
- It should provide formal and procedural standards for a notice and action system by following a sector-specific approach.
2020/06/05
Committee: JURI
Amendment 254 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 4
- It should provide rules for an independent dispute settlement mechanism by respecting the national competences of the Member States.
2020/06/05
Committee: JURI
Amendment 258 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 5 a (new)
- It should provide rules regarding the responsibility of content hosting platforms for goods sold or advertised on them, taking into account supporting activities for SMEs in order to minimise their burden when adapting to this responsibility.
2020/06/05
Committee: JURI
Amendment 264 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
A network of national authorities should be established with the following main tasks:
2020/06/05
Committee: JURI
Amendment 267 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1
- regular monitoring of the algorithms employed by content hosting platforms for the purpose of content moderation as well as curation;
2020/06/05
Committee: JURI
Amendment 269 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 1 a (new)
- regular monitoring of the practice of automated content filtering and curation, and reporting to the EU institutions;
2020/06/05
Committee: JURI
Amendment 272 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 2
- regular review of the compliance of content hosting platforms with the Regulation and other provisions that form part of the Digital Services Act, in particular as regards the correct implementation of the standards for notice-and-action procedures and content moderation in their terms and conditions, on the basis of transparency reports provided by the content hosting platforms and the public database of decisions on removal of content to be established by the Digital Services Act;
2020/06/05
Committee: JURI
Amendment 275 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 3 a (new)
- cooperating and coordinating with the national authorities of the Member States as regards the implementation of the Digital Services Act.
2020/06/05
Committee: JURI
Amendment 279 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – introductory part
- reporting to the Commission detected non-compliance with the rules established by the Digital Services Act, including publishing biannual reports on all of its activities.
2020/06/05
Committee: JURI
Amendment 282 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 1
- failure to implement the notice-and-action system provided for in the Regulation;
deleted
2020/06/05
Committee: JURI
Amendment 287 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 2
- failure to provide transparent, accessible and non-discriminatory terms and conditions;
deleted
2020/06/05
Committee: JURI
Amendment 292 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 3
- failure to provide access for the European Agency to content moderation and curation algorithms for review;
deleted
2020/06/05
Committee: JURI
Amendment 299 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 4
- failure to submit transparency reports to the European Agency;
deleted
2020/06/05
Committee: JURI
Amendment 306 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 5
- publishing biannual reports on all of its activities.
deleted
2020/06/05
Committee: JURI
Amendment 309 #
Motion for a resolution
Annex I – part A – part I – section 3 –– introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly provide transparency reports to the Commission and the network of national authorities. Such reports should, in particular, include:
2020/06/05
Committee: JURI
Amendment 321 #
Motion for a resolution
Annex I – part A – part II – section 1 – introductory part
Measures regarding content curation, data and online advertisements in breach of fair contractual rights of users should include:
2020/06/05
Committee: JURI
Amendment 335 #
Motion for a resolution
Annex I – part A – part II – section 2
The path to fair implementation of the rights of users as regards interoperability and portability should include:
- an assessment of the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power, in particular through the interoperability and portability of data.
deleted
2020/06/05
Committee: JURI
Amendment 337 #
Motion for a resolution
Annex I – part A – part II – section 2 – indent 1
- an assessment of the possibility of defining fair contractual conditions to facilitate data sharing with the aim of addressing imbalances in market power, in particular through the interoperability and portability of data.
deleted
2020/06/05
Committee: JURI
Amendment 347 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 1
- measures ensuring that the proper legislative framework is in place for the development and deployment of digital services, including distributed ledger technologies such as block chains, and in particular for smart contracts,
2020/06/05
Committee: JURI
Amendment 348 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 2
- measures ensuring that smart contracts are fitted with mechanisms that can halt their execution, in particular given concerns of the weaker party and in respect for the rights of creditors in insolvency and restructuring.
deleted
2020/06/05
Committee: JURI
Amendment 350 #
Motion for a resolution
Annex I – part A – part II – section 3 – indent 2 a (new)
- measures to ensure equality between the parties in the case of smart contracts, taking into account in particular the interests of small businesses and SMEs, for which the Commission should examine possible modalities.
2020/06/05
Committee: JURI
Amendment 351 #
Motion for a resolution
Annex I – part A – part II – section 4
The path to equitable private international law rules that do not deprive users of access to justice should:
- include measures ensuring that non-negotiable terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
- include measures clarifying private international law rules as regards data in a way that is not detrimental to Union subjects,
- build on multilateralism and, if possible, be agreed in the appropriate international fora.
deleted
2020/06/05
Committee: JURI
Amendment 352 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 1
- include measures ensuring that non-negotiable terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
deleted
2020/06/05
Committee: JURI
Amendment 356 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 2
- include measures clarifying private international law rules as regards data in a way that is not detrimental to Union subjects,
deleted
2020/06/05
Committee: JURI
Amendment 358 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 3
- build on multilateralism and, if possible, be agreed in the appropriate international fora.
deleted
2020/06/05
Committee: JURI
Amendment 359 #
Motion for a resolution
Annex I – part A – part II – section 4– final part
Only where it proves impossible to achieve a solution based on multilateralism in reasonable time, should measures applied within the Union be proposed, in order to ensure that the use of digital services in the Union is fully governed by Union law under the jurisdiction of Union courts.
deleted
2020/06/05
Committee: JURI
Amendment 360 #
Motion for a resolution
Annex I – part B – recital 1
(1) The terms and conditions that digital service providers apply in relations with users are often non-negotiable and can be unilaterally amended by those providers. Action at a legislative level is needed to put in place minimum standards for such terms and conditions, in particular as regards procedural standards for content management;
2020/06/05
Committee: JURI
Amendment 368 #
Motion for a resolution
Annex I – part B – recital 5
(5) Concerning relations with users, this Regulation should lay down minimum standards for the transparency and accountability of terms and conditions of content hosting platforms. Terms and conditions should include transparent, binding and uniform standards and procedures for content moderation, which should guarantee accessible and independent recourse to judicial redress.
2020/06/05
Committee: JURI
Amendment 369 #
Motion for a resolution
Annex I – part B – recital 6
(6) User-targeted amplification of content based on the views in such content is one of the most detrimental practices in the digital society, especially when such content is amplified on the basis of previous user interaction with other amplified content and with the purpose of optimising user profiles for targeted advertisements.
deleted
2020/06/05
Committee: JURI
Amendment 376 #
Motion for a resolution
Annex I – part B – recital 8 a (new)
(8a) Far too many goods sold online do not follow safety standards. One way of ensuring that content hosting platforms perform due diligence checks of goods sold by them or through them is to make the platforms jointly and severally liable together with the primary seller. This would not be unreasonable for the content hosting platforms, given that they take a share of the proceeds. Special attention should be paid to enabling small and medium-sized platforms to perform these checks, and any supporting activity, such as standardisation, should ensure that administrative burdens are kept to a minimum.
2020/06/05
Committee: JURI
Amendment 377 #
Motion for a resolution
Annex I – part B – recital 9
(9) This Regulation should not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante control of content.
deleted
2020/06/05
Committee: JURI
Amendment 383 #
Motion for a resolution
Annex I – part B – recital 9 a (new)
(9a) This Regulation does not prevent platforms from using an automated content moderation mechanism where necessary and justified, and in particular promotes the use of such a mechanism where the illegal nature of the content has either been established by a court or can be easily determined without contextualisation.
2020/06/05
Committee: JURI
Amendment 384 #
Motion for a resolution
Annex I – part B – recital 10
(10) This Regulation should also include provisions against unjustified content moderation practices.
2020/06/05
Committee: JURI
Amendment 386 #
Motion for a resolution
Annex I – part B – recital 11
(11) The right to issue a notice pursuant to this Regulation should remain with any natural or legal person, including public bodies, to which content is provided through a website or application. A content hosting platform should, however, be able to block a user who repeatedly issues false notices from issuing further notices.
2020/06/05
Committee: JURI
Amendment 388 #
Motion for a resolution
Annex I – part B – recital 12
(12) After a notice has been issued, the uploader should be informed about it and in particular about the reason for the notice, be provided information about the procedure, including about appeal and referral to independent dispute settlement bodies, and about available remedies in the event of false notices. Such information should, however, not be given if the content hosting platform has been informed by public authorities about ongoing law enforcement investigations. In such case, it should be for the relevant authorities to inform the uploader about the issue of a notice, in accordance with applicable rules.
deleted
2020/06/05
Committee: JURI
Amendment 391 #
Motion for a resolution
Annex I – part B – recital 14
(14) Given the immediate nature of content hosting and the often ephemeral purpose of content uploading, it is necessary to establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse; however, such a process should not prevent the user's right of access to justice.
2020/06/05
Committee: JURI
Amendment 398 #
Motion for a resolution
Annex I – part B – recital 17
(17) Circumstances on the basis of which jurisdiction is established must be in the interests of the users, so that both the place where the content has been uploaded and the place where it has been downloaded shall be deemed to constitute a ground of jurisdiction.
2020/06/05
Committee: JURI
Amendment 399 #
Motion for a resolution
Annex I – part B – recital 18
(18) Whistleblowing helps to prevent breaches of law and detect threats or harm to the general interest that would otherwise remain undetected. Providing protection for whistleblowers plays an important role in protecting freedom of expression, media freedom and the public’s right to access information. Directive (EU) 2019/1937 should therefore apply to the relevant breaches of this Regulation. Accordingly, that Directive should be amended.
deleted
2020/06/05
Committee: JURI
Amendment 401 #
Motion for a resolution
Annex I – part B – recital 20
(20) Since the objective of this Regulation, namely to establish a regulatory framework for contractual rights as regards content management in the Union, cannot be sufficiently achieved by the Member States but can rather, by reason of its scale and effects, be better achieved at Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective.
2020/06/05
Committee: JURI
Amendment 402 #
Motion for a resolution
Annex I – part B – recital 21
(21) Action at Union level as set out in this Regulation would be substantially enhanced with the establishment of a Union agency tasked with monitoring and ensuring compliance by content hosting platforms with the provisions of this Regulation. The Agency should review compliance with the standards laid down for content management on the basis of transparency reports and an audit of algorithms employed by content hosting platforms for the purpose of content management.
deleted
2020/06/05
Committee: JURI
Amendment 409 #
Motion for a resolution
Annex I – part B – Article 1 – paragraph 1
The purpose of this Regulation is to contribute to the proper functioning of the internal market by laying down rules to provide digital services providers with a clear, uniform, and up-to-date innovation friendly regulatory framework in the Single Market, to protect, enable, and empower users when accessing digital services and to ensure the necessary cooperation among Member States in order to have an oversight of digital service providers in the EU.
2020/06/05
Committee: JURI
Amendment 411 #
Motion for a resolution
Annex I – part B – Article 2 – paragraph 1
This Regulation applies to providers offering digital services accessible on websites or through smart phone applications in the Union, irrespective of the place of establishment or registration, or principal place of business, in particular online platforms such as social media, search engines, online marketplaces or collaborative economy services.
2020/06/05
Committee: JURI
Amendment 415 #
Motion for a resolution
Annex I – part B – Article 3 –point 1
(1) ‘content hosting platform’ means a provider of information society services within the meaning of point (b) of Article 1(1) of Directive (EU) 2015/1535 of the European Parliament and of the Council1, consisting of the storage of information provided by the recipient of the service at his or her request, within the meaning of Article 14 of Directive 2000/31/EC, irrespective of its place of establishment, which directs its activities to users residing in the Union;
__________________
1 Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1).
2020/06/05
Committee: JURI
Amendment 417 #
Motion for a resolution
Annex I – part B – Article 3 –point 2
(2) 'illegal content' means any information which is not in compliance with Union law or the law of a Member State concerned;
2020/06/05
Committee: JURI
Amendment 426 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 1
1. Content management shall be conducted in a fair, lawful and transparent manner. Content management practices shall be appropriate, relevant and proportionate to what is necessary in relation to the purposes for which the content is managed.
2020/06/05
Committee: JURI
Amendment 428 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 2
2. Users shall not be subjected to discriminatory content moderation practices by the content hosting platforms, such as removal of user-generated content based on appearance, ethnic origin, gender, sexual orientation, religion or belief, disability, age, pregnancy or upbringing of children, language or social class.
deleted
2020/06/05
Committee: JURI
Amendment 434 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Article 4a
Voluntary action
1. Without prejudice to Articles 12-14 of Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), a content hosting platform is not liable for any of the information that it stores, indexes, makes available or transmits simply by virtue of the fact that it has taken voluntary action in good faith, whether of an automated or a non-automated nature, to identify, remove, disable access to, or otherwise restrict information or activity that the service provider reasonably considers to be illegal or otherwise objectionable.
2. Where a content hosting platform takes voluntary action in accordance with paragraph 1:
(a) it shall not be taken to imply that, as a result of the voluntary action, the content hosting platform has knowledge of or control over the information which it transmits or stores;
(b) nor shall it be taken to imply that, as a result of the voluntary action, the activity of the content hosting platform is not of a mere technical, automatic and passive nature.
3. Paragraphs 1 and 2 shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the content hosting platform to terminate or prevent an infringement.
2020/06/05
Committee: JURI
Amendment 435 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Article 4a
Responsibility for goods
1. Any person procuring goods from a content hosting platform or through advertising on a platform shall have the right to pursue remedies against the platform if the person has pursued his or her remedies against the supplier but has failed to obtain the satisfaction to which he or she is entitled according to the law or the contract for the supply of goods.
2. The Commission should publish guidelines in particular for small and medium sized platforms in order to support them coping with their responsibility for goods and to ensure that administrative burdens are kept to a minimum.
3. A platform that has become liable according to this article shall have the right to be indemnified by the supplier.
2020/06/05
Committee: JURI
Amendment 437 #
Motion for a resolution
Annex I – part B – Article 4 b (new)
Article 4b
Transparency obligation
1. Digital services actively hosting or moderating online content shall take the necessary measures in order to disclose the funding and the power of interest groups behind those using their services so that the person legally responsible and accountable should be identifiable.
2. Digital service providers without a permanent establishment in the EU shall designate a legal representative for user interest within the European Union and make the contact information of this representative visible and accessible on their websites.
2020/06/05
Committee: JURI
Amendment 441 #
Motion for a resolution
Annex I – part B – Article 5 – subparagraph 2
A content hosting platform may block a user who repeatedly issues false notices from issuing further notices.
deleted
2020/06/05
Committee: JURI
Amendment 469 #
Motion for a resolution
Annex I – part B – Article 12 – title
Stay-down principle
2020/06/05
Committee: JURI
Amendment 470 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1
Without prejudice to judicial or administrative orders regarding content online, content that has been the subject of a notice shall remain visible until a final decision has been taken regarding its removal or takedown.
deleted
2020/06/05
Committee: JURI
Amendment 474 #
Motion for a resolution
Annex I – part B – Article 12 – paragraph 1 a (new)
Digital service providers should act expeditiously to make unavailable or remove illegal content that has been notified to them and make best efforts to prevent future uploads of the same content.
2020/06/05
Committee: JURI
Amendment 476 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 1
1. Member States may establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse when decisions on content moderation are appealed against.
2020/06/05
Committee: JURI
Amendment 478 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 4
4. Content hosting platforms that enjoy a dominant position on the market shall contribute financially to the operating costs of the independent dispute settlement bodies through a dedicated fund.
deleted
2020/06/05
Committee: JURI
Amendment 483 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3
3. As regards jurisdiction, the competent independent dispute settlement body shall be that located in the Member State in which the content that is the subject of the dispute has been uploaded.
deleted
2020/06/05
Committee: JURI
Amendment 485 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3 a (new)
3a. Both the place where the content has been uploaded and accessed shall be deemed to constitute a ground of jurisdiction.
2020/06/05
Committee: JURI
Amendment 488 #
Motion for a resolution
Annex I – part B – Article 17
Article 17
Amendments to Directive (EU) 2019/1937
Directive (EU) 2019/1937 is amended as follows:
(1) in point (a) of Article 2(1), the following point is added:
“(xi) online content management;”;
(2) in Part I of the Annex, the following point is added:
“K. Point (a)(xi) of Article 2(1) - online content management. Regulation [XXX] of the European Parliament and of the Council on contractual rights as regards content management.”.
deleted
2020/06/05
Committee: JURI