21 Amendments of Stéphane SÉJOURNÉ related to 2020/2019(INL)
Amendment 1 #
Motion for a resolution
Citation 3 a (new)
- having regard to Directive 2013/11/EU of the European Parliament and of the Council of 21 May 2013 on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004, Regulation (EU) No 524/2013 of the European Parliament and of the Council of 21 May 2013 on online dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (Regulation on consumer ODR) and Directive 2009/22/EC (Directive on consumer ADR), and Directive 2008/52/EC of the European Parliament and of the Council of 21 May 2008 on certain aspects of mediation in civil and commercial matters,
Amendment 30 #
Motion for a resolution
Recital D
D. whereas ex-post competition law enforcement alone cannot effectively address the impact of the market dominance of certain online platforms on fair competition in the digital single market; whereas competition law applied to the digital economy sector needs to be redefined in order to equip the sector with effective means to take into account the market power of digital actors;
Amendment 35 #
Motion for a resolution
Recital E
E. whereas content hosting platforms evolved from involving the mere display of content into sophisticated bodies and market players, in particular in the case of social networks that harvest and exploit usage data; whereas users have reasonable grounds to expect fair terms for the usage of such platforms; whereas users, whether private individuals or legal persons, have objective reasons to require fair terms with respect to access, transparency, pricing and conflict resolution;
Amendment 38 #
Motion for a resolution
Recital E a (new)
Ea. whereas, in the context of transactions, the online marketplace contains grey areas, as some websites or online marketplaces are used to sell products in violation of the rules applicable in EU countries, and whereas it is therefore important that measures be taken against internet service providers to stop or prevent infringements of intellectual property rights and to ensure consumer safety;
Amendment 42 #
Motion for a resolution
Recital F
F. whereas content hosting platforms may determine what content is shown to their users, thereby profoundly influencing the way we obtain and communicate information, to the point that content hosting platforms have de facto become public spaces in the digital sphere, causing people to be dispossessed, deprived of their rights, and provoking increasingly serious interference in the functioning of democratic life and repeated violations of fundamental rights; whereas public spaces must be managed in a manner that respects all fundamental rights and the civil law rights of the users;
Amendment 56 #
Motion for a resolution
Recital H
H. whereas content hosting platforms often employ automated content removal mechanisms; whereas such mechanisms, which are highly sophisticated and supported by artificial intelligence, raise legitimate concerns, in particular when content hosting platforms employ them pro-actively and voluntarily, resulting in the removal of illegal, illicit or counterfeit content;
Amendment 59 #
Motion for a resolution
Recital H a (new)
Ha. whereas freedom of expression is a fundamental right enshrined in the Charter of Fundamental Rights of the European Union, which, however, cannot lead to the expression of hate, racist, anti-Semitic, xenophobic or homophobic content, and whereas appropriate ways and means are needed as a matter of urgency to tackle the extremely serious violations currently taking place;
Amendment 74 #
Motion for a resolution
Recital P
P. whereas access to data and its retention are an important factor in the growth of the digital economy; whereas the interoperability of data can, by removing lock-in effects, play an important part in ensuring that fair market conditions exist, on condition that access to data and its retention can be regulated by means of appropriate legal standards;
Amendment 87 #
Motion for a resolution
Paragraph 2
2. Proposes that the Digital Services Act be preceded by an impact assessment to evaluate, analyse and propose appropriate European rules on content management, setting out the responsibility of each of the partners and the development of fair and transparent procedures between platforms, internet users and users;
Amendment 90 #
Motion for a resolution
Paragraph 2 a (new)
2a. Considers that, in the context of the development of online services and in a globalised digital world, the country of origin principle may be unsuitable for reasons recognised in the case law of the Court of Justice of the European Union, in particular as regards consumer protection and intellectual property; notes that the objectives of these platforms are primarily driven by the search for countries where regulations are less restrictive in a number of areas, whether to do with taxation or in connection with illegal or illicit activities; considers that, as a result, it would certainly be useful, in sectors where it is not already established, to apply instead the principle of the country of destination, which would make it possible in future to remedy certain shortcomings in the principle of the law of the country of origin;
Amendment 95 #
Motion for a resolution
Paragraph 2 b (new)
2b. Notes that transparency requirements must be applied to certain platforms in order to ensure that their operation in a closed system does not affect consumer choice, influence consumers' behaviour or constitute a barrier to the freedoms of opinion and expression; stresses that, in the case of an online trading platform, the use of any identical product or service, or of a distinctive sign similar to a recognised trademark, poses a risk of confusion on the part of the public and of damage to the trademark itself; when the service provider becomes aware of such a risk, it must withdraw, or make it impossible to access, the information or the product as soon as possible;
Amendment 123 #
Motion for a resolution
Paragraph 5 a (new)
5a. Recalls that currently content moderation at European level is done on the basis of injunctions which have no legal force, and that the Commission only requires platforms to moderate the distribution of hate content or remove terrorist content; recalls that the power to moderate should be removed from the platforms themselves, and that, as part of the impact assessment, consideration should be given to the best way of entrusting this moderation to a fully independent external body;
Amendment 125 #
Motion for a resolution
Paragraph 6
6. Suggests that content hosting platforms regularly publish and submit comprehensive transparency reports, including on their content policies, to the existing or new European Agency, or European body, concerning the compliance of their terms and conditions with the provisions of the Digital Services Act; further suggests that content hosting platforms make available, in an easily accessible manner, their content policies and publish their decisions on removing user-generated content on a publicly accessible database;
Amendment 141 #
Motion for a resolution
Paragraph 8
8. Takes the firm position that the Digital Services Act must not contain provisions forcing content hosting platforms to employ any form of effective, transparent and fully automated ex-ante controls of content; notes that the algorithms used today for the detection of harmful content and the actions carried out by human moderators are becoming increasingly effective and precise; notes, however, that it is clear that the quality of content moderation is largely dependent on the databases, and therefore on the human work that develops them;
Amendment 150 #
Motion for a resolution
Paragraph 8 a (new)
8a. Stresses, therefore, that the platforms must be transparent in the processing of algorithms and of the data which train them, and must have effective means of moderation, which depend on the models developed by certain international platforms whose economic model is based on maximum extraction of data for immediate reinjection into the advertising services market; considers that it is therefore in the interest of internet users and users alike to require the platforms to be transparent as regards the choice of the tools they prioritise for the processing of algorithms and the accompanying human actions;
Amendment 182 #
Motion for a resolution
Paragraph 14
Amendment 186 #
Motion for a resolution
Paragraph 14 a (new)
14a. Encourages diversity of opinions and beliefs on digital platforms, but considers that freedom of expression does not justify the publication of all content and that measures must be taken to ensure a balance between freedom of expression and the rights of other users; considers that the new legislation should encourage the reporting of abuse by other users;
Amendment 246 #
Motion for a resolution
Annex I – part A – part I – section 1 –indent 2 a (new)
- It should provide for a dialogue between major content hosting platforms and the relevant, existing or new, European Agency or European body, together with national authorities, on the risk management of content management of legal content.
Amendment 302 #
Motion for a resolution
Annex I – part A – part I – section 2 – indent 4 – subi. 4
- failure to submit transparency reports to the relevant, existing or new, European Agency or European body;
Amendment 382 #
Motion for a resolution
Annex I – part B – recital 9
(9) This Regulation should not contain provisions forcing content hosting platforms to employ any form of fully automated ex-ante control of content which is effective and transparent.
Amendment 405 #
Motion for a resolution
Annex I – part B – recital 21
(21) The application of this Regulation should be closely monitored by an existing or new European Agency, or European body, tasked, in particular, to ensure compliance by content hosting platforms with the provisions of this Regulation. The respective Agency or European body should review compliance with the standards laid down for content management on the basis of transparency reports and an audit of algorithms employed by content hosting platforms for the purpose of content management.