
Activities of Geert BOURGEOIS related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (37)

Amendment 277 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of manifestly illegal content related to serious crimes, act promptly to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated so that it is evident to a layperson, without any substantive analysis, that the content is illegal and related to serious crimes.
2021/07/08
Committee: IMCO
Amendment 352 #
Proposal for a regulation
Recital 35
(35) In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, very large online platforms and very large social online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to address the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
2021/07/08
Committee: IMCO
Amendment 356 #
Proposal for a regulation
Recital 36
(36) In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have to have a physical location.
2021/07/08
Committee: IMCO
Amendment 361 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country that offer services in the Union should designate a permanent, sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with.
2021/07/08
Committee: IMCO
Amendment 365 #
Proposal for a regulation
Recital 38
(38) Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of protecting fundamental rights, in particular freedom of expression and of information, transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes.
2021/07/08
Committee: IMCO
Amendment 367 #
Proposal for a regulation
Recital 38 a (new)
(38a) Very large social online platforms play an essential role in the public debate. They can be considered the modern-day version of a postal service. Anyone who is barred from this handful of platforms is largely silenced. It is not appropriate that those platforms should be free to engage in censorship as they see fit, whether or not by means of automated systems, enabling them to steer the public debate (intentionally or unintentionally) in a particular direction. Moreover, practice has shown that content moderation by automated systems is context-insensitive and all too often removes humour, satire, irony, legitimate forms of protest, and political opinions. To ensure freedom of expression and of information, provision should be made for a derogation from freedom of contract for these providers of intermediary services. A universal service obligation should be imposed on very large social online platforms. Those platforms should allow anyone, in principle, to post and receive content on their platforms. They should remove, on their own initiative, only manifestly illegal content related to serious crimes. Universal service should be provided without discrimination of any kind. Any universal service tariffs should be objective, transparent, non-discriminatory and fair.
2021/07/08
Committee: IMCO
Amendment 380 #
Proposal for a regulation
Recital 40
(40) Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, with the exception of micro or small enterprises as defined in Commission Recommendation 2003/361/EC, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned ('notice'), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content ('action'). Very large social online platforms should remove, on their own initiative, only manifestly illegal content related to serious crimes, including in response to a notice. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as providers of hosting services covered by this Regulation.
2021/07/08
Committee: IMCO
Amendment 390 #
Proposal for a regulation
Recital 42
(42) Where a hosting service provider, within the limits of the rules laid down by this Regulation, decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be manifestly illegal content related to serious crimes, illegal content, or incompatible with the applicable terms and conditions. This obligation should not apply to micro and small enterprises as defined in Commission Recommendation 2003/361/EC. Available means of redress to challenge the decision of the hosting service provider should always include judicial redress.
2021/07/08
Committee: IMCO
Amendment 406 #
Proposal for a regulation
Recital 46
(46) Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation ('Europol'), or they can be non-governmental organisations and semi-public bodies, such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council43.
__________________
43 Regulation (EU) 2016/794 of the European Parliament and of the Council of 11 May 2016 on the European Union Agency for Law Enforcement Cooperation (Europol) and replacing and repealing Council Decisions 2009/371/JHA, 2009/934/JHA, 2009/935/JHA, 2009/936/JHA and 2009/968/JHA, OJ L 135, 24.5.2016, p. 53.
deleted
2021/07/08
Committee: IMCO
Amendment 419 #
Proposal for a regulation
Recital 47
(47) The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content, and notices or complaints should be considered manifestly unfounded, where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. They should make assessments on a case-by-case basis at all times, taking into account all relevant facts and circumstances. Very large social online platforms should take particular account, albeit not in every respect, of the universal service obligation that they have in principle. Notwithstanding the universal service obligation for very large social online platforms, this is without prejudice to the freedom of online platforms to determine their terms and conditions and to establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress should always be available against the decisions taken in this regard by online platforms, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms, within the limits of the rules established by this Regulation, from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
2021/07/08
Committee: IMCO
Amendment 466 #
Proposal for a regulation
Recital 53
(53) Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. Very large social online platforms are a subcategory of very large online platforms, which people use primarily to build a social network and social relationships. Given the essential role that very large social online platforms play in the public debate and in social interaction, it is necessary to impose a universal service obligation on those platforms in addition to the obligations applicable to all very large online platforms.
2021/07/08
Committee: IMCO
Amendment 483 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Under such mitigating measures, very large online platforms should consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform's economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service, in particular freedom of expression and of information. Such measures should be without prejudice to the universal service obligation for very large social online platforms. Very large social online platforms should allow anyone, in principle, to post and receive content on their platforms. Those platforms should remove, on their own initiative, only manifestly illegal content related to serious crimes.
2021/07/08
Committee: IMCO
Amendment 525 #
Proposal for a regulation
Recital 69
(69) The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan. Codes of conduct are without prejudice to the obligations under this Regulation, including the universal service obligation to be met by very large social online platforms.
2021/07/08
Committee: IMCO
Amendment 692 #
Proposal for a regulation
Article 2 – paragraph 1 – point g a (new)
(ga) ‘illegal content related to serious crimes’ means any information which, in itself or by its reference to an activity, including the sale of products or provision of services, appears on the list of ‘serious crimes’ in Annex I;
2021/07/08
Committee: IMCO
Amendment 693 #
Proposal for a regulation
Article 2 – paragraph 1 – point g b (new)
(gb) 'manifestly illegal content’ means content the illegality of which is evident to a layperson without any substantive analysis;
2021/07/08
Committee: IMCO
Amendment 725 #
Proposal for a regulation
Article 2 – paragraph 1 – point p
(p) ‘content moderation’ means, within the limits of the rules laid down by this Regulation, the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
2021/07/08
Committee: IMCO
Amendment 756 #
Proposal for a regulation
Article 5 – paragraph 1 – point a
(a) does not have actual knowledge of manifestly illegal content related to serious crimes;
2021/07/08
Committee: IMCO
Amendment 760 #
Proposal for a regulation
Article 5 – paragraph 1 – point b
(b) upon obtaining such knowledge of manifestly illegal content, acts expeditiously to remove or to disable access to that illegal content.
2021/07/08
Committee: IMCO
Amendment 763 #
Proposal for a regulation
Article 5 – paragraph 1 a (new)
1a. For an interpretation of 'expeditiously', account shall be taken at all times of all specific circumstances, in particular the size of the service provider and the resources it has or ought to have.
2021/07/08
Committee: IMCO
Amendment 943 #
Proposal for a regulation
Article 12 – paragraph 2
2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter, in particular freedom of expression and of information.
2021/07/08
Committee: IMCO
Amendment 952 #
Proposal for a regulation
Article 12 – paragraph 2 a (new)
2a. Paragraphs 1 and 2 shall apply without prejudice to the universal service obligation that very large social online platforms have under Article 33a.
2021/07/08
Committee: IMCO
Amendment 988 #
Proposal for a regulation
Article 13 – paragraph 1 – point c
(c) without prejudice to Article 33a, the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures; providers of intermediary services shall furthermore provide clear information on the use of automated systems;
2021/07/08
Committee: IMCO
Amendment 1026 #
Proposal for a regulation
Article 14 – paragraph 1
1. Providers of hosting services, with the exception of micro or small enterprises as defined in Recommendation 2003/361/EC, shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
2021/07/08
Committee: IMCO
Amendment 1062 #
Proposal for a regulation
Article 14 – paragraph 3
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness, for the purposes of Article 5, only in respect of manifestly illegal content related to serious crimes.
2021/07/08
Committee: IMCO
Amendment 1097 #
Proposal for a regulation
Article 15 – paragraph 1
1. Where a provider of hosting services, within the limits of the rules laid down by this Regulation and in particular by Article 33a, decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision.
2021/07/08
Committee: IMCO
Amendment 1123 #
Proposal for a regulation
Article 15 – paragraph 4 a (new)
4a. Micro or small enterprises as defined in Commission Recommendation 2003/361/EC shall be excluded from the scope of that provision.
2021/07/08
Committee: IMCO
Amendment 1177 #
Proposal for a regulation
Article 17 – paragraph 2
2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Complaints shall be handled in a language chosen by the service recipient.
2021/07/08
Committee: IMCO
Amendment 1257 #
Proposal for a regulation
Article 19
[...]
deleted
2021/07/08
Committee: IMCO
Amendment 1346 #
Proposal for a regulation
Article 20 – paragraph 3 a (new)
3a. Very large social online platforms shall take particular account of the obligation to provide a universal service in principle.
2021/07/08
Committee: IMCO
Amendment 1540 #
Proposal for a regulation
Article 25 – paragraph 4 a (new)
4a. Very large social online platforms are a subcategory of very large online platforms, which people use primarily to build a social network and social relationships.
2021/07/08
Committee: IMCO
Amendment 1618 #
Proposal for a regulation
Article 27 – paragraph 1 – point d
(d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19;
deleted
2021/07/08
Committee: IMCO
Amendment 1648 #
Proposal for a regulation
Article 27 – paragraph 3 a (new)
3a. This article shall be without prejudice to the universal service obligation that very large social online platforms have under Article 33a.
2021/07/08
Committee: IMCO
Amendment 1694 #
Proposal for a regulation
Article 29 – paragraph 1
1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679.
2021/07/08
Committee: IMCO
Amendment 1807 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Universal service obligation for very large social online platforms
1. Very large social online platforms fulfil an essential role in the public debate and social interaction. They shall have a universal service obligation to allow anyone, in principle, to post and receive content on their platforms. They shall provide that service without discrimination.
2. Very large social online platforms shall remove, on their own initiative, only manifestly illegal content related to serious crimes.
3. In the event of misuse as defined in Article 20(1), very large social online platforms may temporarily suspend their services to service recipients that frequently provide manifestly illegal content. When making their assessments in accordance with Article 20(3) and (3a), they shall take particular account of the universal service obligation imposed on them in principle.
4. Any universal service tariffs shall be objective, transparent, non-discriminatory and fair.
2021/07/08
Committee: IMCO
Amendment 1817 #
Proposal for a regulation
Article 34 – paragraph 1 – point b
(b) electronic submission of notices by trusted flaggers under Article 19, including through application programming interfaces;
deleted
2021/07/08
Committee: IMCO
Amendment 2279 #
Proposal for a regulation
Article 68 – paragraph 1 – introductory part
Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council52, recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 17 and 18 on their behalf, provided the body, organisation or association meets all of the following conditions:
__________________
52 [Reference].
2021/07/08
Committee: IMCO
Amendment 2297 #
Proposal for a regulation
Chapter V a (new)
Annex I
For the purposes of this Regulation, 'serious crimes' means the following forms of crime: terrorism, human trafficking, sexual exploitation and sexual abuse, illicit drug trafficking, illicit arms trafficking, incitement to violence, money laundering, corruption, counterfeiting of means of payment, computer crime and organised crime.
2021/07/08
Committee: IMCO