Activities of Karen MELCHIOR related to 2020/2019(INL)
Shadow reports (1)
REPORT with recommendations to the Commission on a Digital Services Act: adapting commercial and civil law rules for commercial entities operating online
Amendments (73)
Amendment 1 #
Motion for a resolution
Citation 3 a (new)
- having regard to Directive 2013/11/EU of the European Parliament and of the Council of 21 May 2013 on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Directive on consumer ADR), Regulation (EU) No 524/2013 of the European Parliament and of the Council of 21 May 2013 on online dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Regulation on consumer ODR), and Directive 2008/52/EC of the European Parliament and of the Council of 21 May 2008 on certain aspects of mediation in civil and commercial matters,
Amendment 3 #
Motion for a resolution
Citation 5 a (new)
- having regard to the commitment of the European Commission President, Ms. Ursula von der Leyen, to upgrade the liability and safety rules for digital platforms, services and products, and complete the Digital Single Market via a Digital Services Act,
Amendment 10 #
Motion for a resolution
Citation 8
- having regard to Article 11 of the Charter of Fundamental Rights of the European Union and Article 10 of the European Convention on Human Rights,
Amendment 11 #
Motion for a resolution
Citation 8 a (new)
- having regard to the 2007 Lugano Convention and the 1958 New York Convention,
Amendment 13 #
Draft opinion
Paragraph 1
1. Welcomes the “CPC Common Position COVID-19”3 issued by the Commission and the Consumer Protection Cooperation (CPC) authorities of the Member States on the most recent reported scams and unfair practices in relation to the COVID-19 outbreak; calls on all platforms to cooperate with the Commission and the competent authorities to better identify illegal practices and take down scams, and asks the Commission to take legislative action on the placement and/or sale of items and services with false, misleading or otherwise abusive content for consumers; believes that the Commission should proactively seek to put in place the means to react rapidly to the crisis in the market; __________________ 3 European Commission / Consumer Protection Cooperation (CPC) Network, Common Position of CPC Authorities, “Stopping scams and tackling unfair business practices on online platforms in the context of the Coronavirus outbreak in the EU”.
Amendment 25 #
Motion for a resolution
Recital C
C. whereas some businesses offering digital services enjoy, due to strong data- driven network effects, market dominance that makes it increasingly difficult for other players to compete and difficult for new businesses to even enter the market;
Amendment 30 #
Draft opinion
Paragraph 2
2. Welcomes efforts to bring transparency and accountability to online advertising and considers that further clarity and guidance is needed as regards professional diligence and obligations for platforms; believes that where advertisers and intermediaries are established in a third country, they should designate a legal representative, established in the Union, who can be held accountable for the content of advertisements, in order, for example, to allow for consumer redress in the case of false or misleading advertisements;
Amendment 33 #
Motion for a resolution
Recital D
D. whereas ex-post competition law enforcement alone cannot effectively address the impact of the market dominance of certain online platforms on fair competition in the digital single market;
Amendment 41 #
Draft opinion
Paragraph 3
3. Asks the Commission to clarify what sanctions or other restrictions those advertisement intermediaries and platforms should be subject to if they knowingly accept false or misleading advertisements; believes that online platforms should actively monitor the advertisements shown on their sites, in order to ensure they do not profit from false or misleading advertisements, including influencer marketing content that is not disclosed as sponsored; underlines that advertisements for commercial products and services and advertisements of a political or other nature differ in form and function and should therefore be subject to different, but complementary, rules;
Amendment 52 #
Draft opinion
Paragraph 4
4. While recalling earlier efforts, asks the Commission to further review the practice of End User Licensing Agreements (EULAs) and to seek ways to ensure compliance with Union law, in order to allow greater and easier engagement for consumers, including in the choice of clauses; considers that restrictions on the use of digital content and digital services imposed through the contractual freedom between right holders and service providers should be banned where they fail to meet the reasonable expectations of the consumer, as protected under Directive (EU) 2019/770 on certain aspects concerning contracts for the supply of digital content and digital services; notes that EULAs are often accepted by users without reading them; moreover notes that where a EULA does allow users to opt out of clauses, platforms may require users to do so at each use;
Amendment 55 #
Motion for a resolution
Recital H
H. whereas automated content removal mechanisms, employed by content hosting platforms, raise legal concerns, in particular as regards possible restrictions of freedom of expression and information, protected under Article 11 of the Charter of Fundamental Rights of the European Union;
Amendment 59 #
Draft opinion
Paragraph 5
5. Underlines that EULAs should always make the sharing of all data with third parties optional unless vital to the functioning of the services, establishing a high level of data protection and security; recommends that any data access remedy should be imposed only to tackle market failures, be in compliance with the GDPR, give consumers the right to object to data sharing and provide consumers with technical solutions to help them control and manage flows of their personal information and have means of redress; asks the Commission to ensure that consumers can still use a connected device for all its primary functions even if a consumer withdraws their consent to share non-operational data with the device manufacturer or third parties;
Amendment 65 #
Motion for a resolution
Recital I
I. whereas the civil law regimes governing content hosting platforms’ practices in content moderation are based on certain sector-specific provisions at Union and national level, with notable differences in the obligations imposed on content hosting platforms and the enforcement mechanisms deployed; whereas this situation creates a fragmented Digital Single Market and, therefore, requires a response at Union level;
Amendment 68 #
Motion for a resolution
Recital L
L. whereas the choice of the algorithmic logic behind such recommendation systems, comparison services, content curation or advertisement placements remains, not solely but also, at the discretion of the content hosting platforms, with little possibility for public oversight, which raises accountability and transparency concerns;
Amendment 69 #
Draft opinion
Paragraph 6
6. Underlines that Directive (EU) 2019/770 and Directive (EU) 2019/771 are still to be properly transposed and implemented; asks the Commission to take this into account when taking additional measures;
Amendment 72 #
Motion for a resolution
Recital O
O. whereas the terms and conditions of platforms, which are non-negotiable, often indicate both applicable law and competent courts outside the Union, which represents an obstacle as regards access to justice; whereas Regulation (EU) No 1215/2012 on jurisdiction and the recognition and enforcement of judgments in civil and commercial matters lays down rules on jurisdiction; whereas Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data clarifies the data subject’s right to private enforcement action directly against the controller or processor, regardless of whether the processing takes place in the Union or not and regardless of whether the controller is established in the Union or not; whereas Article 79 of Regulation (EU) 2016/679 stipulates that proceedings shall be brought before the courts of the Member State where the controller or processor has an establishment or, alternatively, where the data subject has his or her habitual residence;
Amendment 75 #
Draft opinion
Paragraph 7
7. Notes the rise of “smart contracts” based on distributed ledger technologies; asks the Commission to analyse whether certain aspects of “smart contracts” should be clarified and whether guidance should be given in order to ensure legal certainty for businesses and consumers; asks especially for the Commission to work to ensure that such contracts with consumers are valid and binding throughout the Union, that they meet the standards of consumer law, including the right of withdrawal under Directive 2011/83/EU4, and that they are not subject to national barriers to application, such as notarisation requirements; __________________ 4 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (OJ L 304, 22.11.2011, p. 64).
Amendment 77 #
Motion for a resolution
Recital P a (new)
Pa. whereas it is important to assess the possibility of tasking an existing or new European Agency, or European body, with the responsibility of ensuring a harmonised approach across the Union and of addressing the new opportunities and challenges, in particular those of a cross-border nature, arising from ongoing technological developments;
Amendment 81 #
Draft opinion
Paragraph 8
8. Stresses that any future legislative proposals should seek to remove current barriers and prevent potential new barriers to the supply of digital services by online platforms; underlines, at the same time, that new Union obligations on platforms must be proportionate and clear in nature in order to avoid unnecessary regulatory burdens or unnecessary restrictions, and be guided by consumer protection and product safety goals; underlines the need to prevent gold-plating of Union legislation by Member States.
Amendment 94 #
Motion for a resolution
Paragraph 2 a (new)
2a. Requests that the Commission ensure that the regulation includes a universal definition of ‘dominant platforms’ and lays down their characteristics.
Amendment 112 #
Motion for a resolution
Paragraph 4
4. Insists that the regulation must prohibit content moderation practices that are discriminatory;
Amendment 116 #
Motion for a resolution
Paragraph 5
5. The application of this regulation should be closely monitored by an existing or new European Agency, or European body, tasked, in particular, to ensure compliance by content hosting platforms with the provisions of this Regulation. The relevant Agency or European body should review compliance with the standards laid down for content management on the basis of transparency reports and an audit of the algorithms employed by content hosting platforms for the purpose of content management;
Amendment 124 #
Motion for a resolution
Paragraph 5 a (new)
5a. Calls for content hosting platforms to evaluate the risk that their content management policies for legal content pose to society (e.g. public health, disinformation) and, on the basis of reports presented to the relevant European Agency or European body, to hold a biannual dialogue with the relevant European Agency or European body and the relevant national authorities;
Amendment 125 #
Motion for a resolution
Paragraph 6
6. Suggests that content hosting platforms regularly publish and submit comprehensive transparency reports, including on their content policies, to the existing or new European Agency, or European body, concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms make available, in an easily accessible manner, their content policies and publish their decisions on removing user-generated content in a publicly accessible database;
Amendment 144 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante controls of content, and considers that any such mechanism voluntarily employed by platforms must be subject to audits by the relevant, existing or new, European Agency or European body to ensure compliance with the Digital Services Act;
Amendment 181 #
Motion for a resolution
Paragraph 13
13. Calls for content hosting platforms to use targeted advertisement based on the user’s prior interaction with content on the same content hosting platform or on third-party websites only after having obtained the prior consent of the user, in accordance with Regulation (EU) 2016/679;
Amendment 184 #
Motion for a resolution
Paragraph 14
14. Further calls for users to be guaranteed an appropriate degree of influence over the criteria according to which content is curated and made visible for them; affirms that this should also include the option to opt out from any content curation;
Amendment 188 #
Motion for a resolution
Paragraph 15
15. Suggests that content hosting platforms make all sponsorships and advertisements clearly visible to their users, indicating at all times who has paid for them and, if applicable, on behalf of whom they are being placed;
Amendment 214 #
Motion for a resolution
Paragraph 19
19. Considers that non-negotiable terms and conditions shall not prevent effective access to justice in Union courts or disenfranchise Union citizens or businesses; calls on the Commission to assess whether the protection of access rights to personal and non-personal data under private international law is uncertain and leads to disadvantages for Union citizens and businesses;
Amendment 230 #
Motion for a resolution
Annex I – part A – introductory part – indent 1 a (new)
- The proposal focuses on content moderation and curation, and on civil and commercial law rules with respect to digital services. Other aspects, such as the regulation of online marketplaces, are not addressed, but should be included in the Digital Services Act Regulation to be proposed by the European Commission.
Amendment 246 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 2 a (new)
- It should provide for a dialogue between major content hosting platforms and the relevant, existing or new, European Agency or European body, together with national authorities, on the risk management of the content management of legal content.
Amendment 255 #
Motion for a resolution
Annex I – part A – part I – section 1 – indent 5
- It should fully respect the Charter of Fundamental Rights of the European Union, as well as Union rules protecting users and their safety, privacy and personal data.
Amendment 262 #
Motion for a resolution
Annex I – part A – part I – section 2 – introductory part
Asks the Commission to entrust an existing or new European Agency or European body with the following main tasks:
Amendment 295 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 3
- failure to provide access for the relevant, existing or new, European Agency or European body to content moderation and curation algorithms for review;
Amendment 302 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 4
- failure to submit transparency reports to the relevant, existing or new, European Agency or European body;
Amendment 308 #
Motion for a resolution
Annex I – part A – part I – section 3 – introductory part
The Digital Services Act should contain provisions requiring content hosting platforms to regularly publish and provide transparency reports to the respective, existing or new, European Agency or European body. Such reports should be comprehensive, following a consistent methodology. Transparency reports should, in particular, include:
Amendment 315 #
Motion for a resolution
Annex I – part A – part I – section 3 – indent 1 – subi. 3
- the total number of removal requests complied with, and the total number of referrals of content to competent authorities,
Amendment 317 #
Motion for a resolution
Annex I – part A – part I – section 3 – indent 1 – subi. 8
- information on the enforcement of terms and conditions and information, per Member State, on court rulings ordering the removal and/or deletion of terms and conditions considered illegal.
Amendment 324 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 1
- Measures to minimise the data collected by content hosting platforms, based on, inter alia, interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements and by requiring the prior consent of the user.
Amendment 325 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 1
- Enforcement of existing measures to limit the data collected by content hosting platforms, based on, inter alia, interactions of users with content hosted on content hosting platforms, for the purpose of completing targeted advertising profiles, in particular by imposing strict conditions for the use of targeted personal advertisements.
Amendment 328 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 2
- Users of content hosting platforms should be given the choice to opt in to, and to withdraw their consent to, being subject to targeted advertisements, in line with data protection and privacy rules.
Amendment 330 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 3 – introductory part
- Content hosting platforms should make available an archive of the sponsorships and advertisements that were shown to their users, including the following:
Amendment 331 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 3 – subi. 1
- whether the advertisement or sponsorship is currently active or inactive,
Amendment 332 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 3 – subi. 2
- the timespan during which the advertisement or sponsorship was active,
Amendment 333 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 3 – subi. 3
- the name and contact details of the sponsor or advertiser and, if different, on behalf of whom the advertisement or the sponsorship is being placed,
Amendment 334 #
Motion for a resolution
Annex I – part A – part II – section 1 – indent 3 – subi. 6
Amendment 353 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 1
- include the effective enforcement of existing measures ensuring that non- negotiable terms and conditions do not include provisions regulating private international law matters to the detriment of access to justice,
Amendment 357 #
Motion for a resolution
Annex I – part A – part II – section 4 – indent 2
- include measures clarifying private international law rules as regards data so as to, inter alia, consider the activities of platforms, so that they are not detrimental to Union subjects,
Amendment 371 #
Motion for a resolution
Annex I – part B – recital 6 a (new)
(6a) In order to ensure evaluation of the risks presented by content amplification, this Regulation establishes a biannual dialogue on content management policies for legal content between major content hosting platforms and the respective, existing or new, European Agency or European body, together with the relevant national authorities.
Amendment 375 #
Motion for a resolution
Annex I – part B – recital 7
(7) In order to ensure, inter alia, that users can assert their rights, they should be given an appropriate degree of influence over the curation of content made visible to them, including the possibility to opt out of any content curation altogether. In particular, users should not be subject to curation without specific consent.
Amendment 394 #
Motion for a resolution
Annex I – part B – recital 15
(15) In order to ensure that users and notifiers make use of referral to independent dispute settlement bodies as a first step, it must be emphasised that such referral should not preclude any subsequent court action. Given that content hosting platforms which enjoy a dominant position on the market can particularly gain from the introduction of independent dispute settlement bodies, it is appropriate that they take responsibility for the financing of such bodies. Those bodies shall be provided with adequate resources to ensure their competence and independence.
Amendment 396 #
Motion for a resolution
Annex I – part B – recital 16
(16) Users should have the right to referral to a fair and independent dispute settlement body, as an alternative dispute settlement mechanism, to contest a decision taken by a content hosting platform following a notice concerning content they uploaded. Notifiers should have this right if they would have had legal standing in a civil procedure regarding the content in question.
Amendment 397 #
Motion for a resolution
Annex I – part B – recital 17
(17) As regards jurisdiction, the competent independent dispute settlement body should be that located in the Member State in which the content forming the subject of the dispute was uploaded. For natural persons, it should always be possible to bring complaints to the independent dispute settlement body of their Member State.
Amendment 400 #
Motion for a resolution
Annex I – part B – recital 19
(19) This Regulation should include obligations to report on its implementation and to review it within a reasonable time. For this purpose, the independent dispute settlement bodies established pursuant to this Regulation should submit reports on the decisions taken – anonymising personal data as appropriate – including the number of referrals dealt with, data on systemic problems, trends and the identification of traders not complying with the decisions of the alternative dispute settlement body.
Amendment 405 #
Motion for a resolution
Annex I – part B – recital 21
(21) The application of this Regulation should be closely monitored by an existing or new European Agency, or European body, tasked, in particular, to ensure compliance by content hosting platforms with the provisions of this Regulation. The respective Agency or European body should review compliance with the standards laid down for content management on the basis of transparency reports and an audit of the algorithms employed by content hosting platforms for the purpose of content management.
Amendment 416 #
Motion for a resolution
Annex I – part B – Article 3 – point 1 a (new)
(1a) ‘Dominant platforms’ or ‘dominant content hosting platforms’ means an information society service with several of the following characteristics: (a) ‘bottleneck power’, meaning the capacity to develop or preserve its user base because of network effects which lock in a significant part of its users, or a positioning in the downstream market that allows it to create economic dependency; (b) a considerable size in the market, measured either by the number of active users or by the annual global turnover of the platform; (c) integration into a business or network environment controlled by its group or parent company, which allows for leveraging market power from one market into an adjacent market; (d) a gatekeeper role for a whole category of content or information; (e) access to large amounts of high-quality personal data, either provided by users or inferred about users on the basis of monitoring their online behaviour, where such data is indispensable for providing and improving a similar service and is difficult for potential competitors to access or replicate;
Amendment 421 #
Motion for a resolution
Annex I – part B – Article 3 – point 5 a (new)
(5a) ‘Sponsorship’ means content paid for or placed on behalf of a third party;
Amendment 427 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 1 a (new)
1a. Dominant content hosting platforms shall evaluate the risks of their content management policies.
Amendment 431 #
Motion for a resolution
Annex I – part B – Article 4 – paragraph 4
4. Content hosting platforms shall provide users with an appropriate degree of influence over the curation of content made visible to them, including the choice of opting out of content curation altogether. In particular, users shall not be subject to content curation without their specific prior consent.
Amendment 436 #
Motion for a resolution
Annex I – part B – Article 4 a (new)
Article 4a Structured risk dialogue on content curation As part of a structured risk dialogue with the existing or new European Agency, or European body, together with the relevant national authorities, dominant content hosting platforms shall present a report to the Commission or the relevant Agency or European body on their risk management of content curation on their platform and on how they mitigate those risks.
Amendment 439 #
Motion for a resolution
Annex I – part B – Article 5 – subparagraph 1
Any natural or legal person or public body to which content is provided through a website, application or other software shall have the right to issue a notice pursuant to this Regulation.
Amendment 446 #
Motion for a resolution
Annex I – part B – Article 7 – introductory part
A notice regarding content shall be made in writing and shall include at least the following information:
Amendment 447 #
Motion for a resolution
Annex I – part B – Article 7 – point a
(a) a link to the content in question and, where appropriate, such as for video content, a timestamp;
Amendment 461 #
Motion for a resolution
Annex I – part B – Article 10 – introductory part
Once a decision has been taken, content hosting platforms shall inform all parties involved in the notice procedure about the outcome of the decision, providing the following information in a clear and simple manner:
Amendment 464 #
Motion for a resolution
Annex I – part B – Article 10 – point b
(b) whether the decision was made by a human or an algorithm and, in the latter case, whether a human review has taken place;
Amendment 465 #
Motion for a resolution
Annex I – part B – Article 10 – point c
(c) information about the possibility for review as referred to in Article 11 and judicial redress for either party.
Amendment 467 #
Motion for a resolution
Annex I – part B – Article 11
Amendment 475 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 1
1. Member States shall establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse when decisions on content moderation are appealed against. The independent dispute settlement bodies should as a minimum comply with the quality requirements for consumer ADR bodies set down under Directive 2013/11/EU.
Amendment 477 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 3
3. The referral of a question regarding content moderation to an independent dispute settlement body shall not preclude a user from being able to have further recourse in the courts unless the dispute has been settled by common agreement.
Amendment 479 #
Motion for a resolution
Annex I – part B – Article 13 – paragraph 4
4. Content hosting platforms that enjoy a dominant position on the market shall contribute financially to the operating costs of the independent dispute settlement bodies through a dedicated fund. Member States shall ensure these bodies are provided with adequate resources.
Amendment 482 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 1
1. The uploader shall have the right to refer a case of content moderation to the competent independent dispute settlement body where the content hosting platform has decided to remove, take down or make invisible content, or otherwise to act in a manner that is contrary to the action preferred by the uploader as expressed by the uploader.
Amendment 484 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 3
3. As regards jurisdiction, the competent independent dispute settlement body shall be that located in the Member State in which the content that is the subject of the dispute was uploaded. For natural persons, it should always be possible to bring complaints to the independent dispute settlement body of their Member State.
Amendment 486 #
Motion for a resolution
Annex I – part B – Article 14 – paragraph 4
4. Where the notifier has the right to refer a case of content moderation to an independent dispute settlement body in accordance with paragraph 2, the notifier may refer the case to the independent dispute settlement body located in the Member State of habitual residence of the notifier or the uploader, if the latter is using the service for non-commercial purposes.