
12 Amendments of Pascal ARIMONT related to 2020/0361(COD)

Amendment 144 #
Proposal for a regulation
Recital 20
(20) The provider should not be able to benefit from the exemptions from liability provided for in this Regulation where the main purpose of its service is to engage in or facilitate illegal activities, or where the provider of intermediary services deliberately collaborates with a recipient of the services in order to undertake illegal activities, since in such cases it does not provide its service neutrally.
2021/07/20
Committee: JURI
Amendment 181 #
Proposal for a regulation
Recital 32
(32) The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. This information, which should include the relevant email addresses, telephone numbers, IP addresses and other contact details necessary to ensure such compliance, should therefore be available in respect of all types of orders. Orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information.
2021/07/20
Committee: JURI
Amendment 265 #
Proposal for a regulation
Recital 58
(58) Very large online platforms should deploy the necessary and proportionate means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should, under such mitigating measures, consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service.
2021/07/20
Committee: JURI
Amendment 389 #
Proposal for a regulation
Article 2 – paragraph 1 – point h a (new)
(h a) ‘editorial platform’ means an intermediary service which is provided in connection with a press publication within the meaning of Article 2(4) of Directive (EU) 2019/790, or with another editorial media service, and which allows users to discuss topics generally covered by the relevant media or to comment on editorial content, and which is under the supervision of the editorial team of the publication or other editorial media.
2021/07/19
Committee: JURI
Amendment 397 #
Proposal for a regulation
Article 2 – paragraph 1 – point i a (new)
(i a) ‘live streaming platform services’ means an information society service whose main purpose, or one of whose main purposes, is to give the public access to live broadcast audio or video material which it organises and promotes for profit-making purposes;
2021/07/19
Committee: JURI
Amendment 421 #
Proposal for a regulation
Article 5 – paragraph 1 a (new)
1 a. Without prejudice to specific deadlines set out in Union law or in administrative or legal orders, providers of hosting services shall, upon obtaining actual knowledge or awareness, remove or disable access to illegal content as soon as possible and in any event:
(a) within 30 minutes where the illegal content pertains to the broadcast of a live sports or entertainment event;
(b) within 24 hours where the illegal content can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety;
(c) within seven days in all other cases where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety.
Where the provider of hosting services cannot comply with the obligation in paragraph 1a on grounds of force majeure or for objectively justifiable technical or operational reasons, it shall, without undue delay, inform the competent authority.
2021/07/19
Committee: JURI
Amendment 423 #
Proposal for a regulation
Article 5 – paragraph 2 a (new)
2 a. Paragraph 1 shall not apply when the main purpose of the information society service is to engage in or facilitate illegal activities or when the provider of the information society service deliberately collaborates with a recipient of the services in order to undertake illegal activities.
2021/07/19
Committee: JURI
Amendment 525 #
Proposal for a regulation
Article 11 – paragraph 1
1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. The Member States may require very large online platforms to designate a legal representative in their Member State.
2021/07/19
Committee: JURI
Amendment 537 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include, in their terms and conditions, information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service; those terms and conditions shall respect European and national law. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format.
2021/07/19
Committee: JURI
Amendment 647 #
Proposal for a regulation
Article 16 a (new)
Article 16a
Notice and action mechanism
1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means.
2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements:
(a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content;
(b) to the extent possible, a clear indication of the electronic location of that information and, where necessary, additional information enabling the identification of the illegal content;
(c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;
(d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are to the best of their knowledge accurate and complete.
3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned.
4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity.
5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision.
6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, within the timelines set out in Article 5(1a) and in a diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
2021/07/19
Committee: JURI
Amendment 743 #
Proposal for a regulation
Article 19 – paragraph 7 a (new)
7a. Online platforms shall, where possible, provide trusted flaggers with access to technical means that help them detect illegal content on a large scale.
2021/07/19
Committee: JURI
Amendment 935 #
Proposal for a regulation
Article 29 – paragraph 1 a (new)
1a. The parameters used in recommender systems shall always be fair and non-discriminatory.
2021/07/19
Committee: JURI