
Activities of Clara AGUILERA related to 2020/0361(COD)

Plenary speeches (1)

Digital Services Act (continuation of debate)
2022/01/19
Dossiers: 2020/0361(COD)

Amendments (23)

Amendment 250 #
Proposal for a regulation
Recital 14
(14) The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council,39 such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. Consequently, providers of services, such as cloud infrastructure, which are provided at the request of parties other than the content providers and only indirectly benefit the latter, should not be covered by the definition of online platforms.
__________________
39 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast), OJ L 321, 17.12.2018, p. 36.
2021/07/08
Committee: IMCO
Amendment 272 #
Proposal for a regulation
Recital 22
(22) In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content taking into account the potential harm the illegal content in question may create. In order to ensure a harmonised implementation of illegal content removal throughout the Union, the provider should, within 24 hours, remove or disable access to illegal content that can seriously harm public policy, public security or public health or seriously harm consumers’ health or safety. According to the well-established case-law of the Court of Justice and in line with Directive 2000/31/EC, the concept of ‘public policy’ involves a genuine, present and sufficiently serious threat which affects one of the fundamental interests of society, in particular for the prevention, investigation, detection and prosecution of criminal offences, including the protection of minors and the fight against any incitement to hatred on grounds of race, sex, religion or nationality, and violations of human dignity concerning individual persons. The concept of ‘public security’ as interpreted by the Court of Justice covers both the internal security of a Member State, which may be affected by, inter alia, a direct threat to the calm and physical security of the population of the Member State concerned, and the external security, which may be affected by, inter alia, the risk of a serious disturbance to the foreign relations of that Member State or to the peaceful coexistence of nations. Where the illegal content does not seriously harm public policy, public security, public health or consumers’ health or safety, the provider should remove or disable access to illegal content within seven days. The deadlines referred to in this Regulation should be without prejudice to specific deadlines set out in Union law or within administrative or judicial orders.
The provider may derogate from the deadlines referred to in this Regulation on the grounds of force majeure or for justifiable technical or operational reasons but it should be required to inform the competent authorities as provided for in this Regulation. The removal or disabling of access should be undertaken in observance of the principles of the Charter of Fundamental Rights, including a high level of consumer protection and freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content.
2021/07/08
Committee: IMCO
Amendment 360 #
Proposal for a regulation
Recital 37
(37) Providers of intermediary services that are established in a third country and that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as a point of contact, provided the relevant requirements of this Regulation are complied with. In addition, recipients of intermediary services should be able to hold the legal representative liable for non-compliance.
2021/07/08
Committee: IMCO
Amendment 606 #
Proposal for a regulation
Article 1 – paragraph 2 – point b
(b) set out harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter, including a high level of consumer protection, are effectively protected.
2021/07/08
Committee: IMCO
Amendment 640 #
Proposal for a regulation
Article 1 – paragraph 5 – point i a (new)
(ia) Directive (EU) 2019/882
2021/07/08
Committee: IMCO
Amendment 765 #
Proposal for a regulation
Article 5 – paragraph 2
2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority, decisive influence or the control of the provider.
2021/07/08
Committee: IMCO
Amendment 777 #
Proposal for a regulation
Article 5 a (new)
Article 5a
Liability of online platforms allowing consumers to conclude distance contracts with traders
1. In addition to Article 5(1), an online platform allowing consumers to conclude distance contracts with traders shall not benefit from the liability exemption provided for in Article 5 if it does not comply with the obligations referred to in Articles 11, 13b, 13c, 14, 22 or 24a. Such liability exemption shall also not benefit the online platform if it does not comply with the specific information requirements for contracts concluded on online marketplaces, in line with Article 6a(1) of Directive 2011/83/EU of the European Parliament and of the Council.
2. The liability exemption in Article 5(1) and in paragraph 1 of this Article shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its control, authority or decisive influence.
3. For the assessment of whether the online platform has that control, authority or decisive influence over the trader, relevant criteria shall include, among others:
(a) the trader-consumer contract is concluded exclusively through facilities provided on the platform;
(b) the online platform operator withholds the identity of the trader or contact details until after the conclusion of the trader-consumer contract;
(c) the online platform operator exclusively uses payment systems which enable the platform operator to withhold payments made by the consumer to the trader;
(d) the terms of the trader-consumer contract are essentially determined by the online platform operator;
(e) the price to be paid by the consumer is set by the online platform operator; or
(f) the online platform is marketing the product or service in its own name rather than using the name of the trader who will supply it.
4. The liability exemption in Article 5(1) of this Regulation shall not apply where an online platform allows consumers to conclude distance contracts with traders from third countries when:
(a) there is no economic operator inside the Union liable for the product safety, or the economic operator is available but does not respond to claims or take measures to remedy the harm; and
(b) the product does not comply with the relevant Union or national law.
5. Consumers concluding distance contracts with traders shall be entitled to seek redress from the online platform for infringement of the obligations laid down in this Regulation and in accordance with relevant Union and national law.
6. The online platform shall be entitled to seek redress from the trader who has used its services in case of a failure by that trader to comply with its obligations under this Regulation regarding the online platform or regarding the consumers.
2021/07/08
Committee: IMCO
Amendment 928 #
Proposal for a regulation
Article 12 – paragraph 1
1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service in fair, non-discriminatory and transparent contract terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be drafted in clear and unambiguous language and shall be publicly available in an easily accessible format in a searchable archive of all the previous versions with their date of application.
2021/07/08
Committee: IMCO
Amendment 1132 #
Proposal for a regulation
Article 15 a (new)
Article 15a
Online interface design and organisation
1. Providers of hosting services shall not distort or impair consumers’ ability to make an informed decision via the structure, function or manner of operation of their online interface or a part thereof.
2. Providers of hosting services shall design and organise their online interface in a way that enables themselves and traders to comply with their obligations under applicable Union and Member State law on consumer protection, including on product safety.
2021/07/08
Committee: IMCO
Amendment 1145 #
Proposal for a regulation
Article 17 – paragraph 1 – introductory part
1. Online platforms shall provide recipients of the service, and individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the decision taken by the provider of the online platform not to act upon receipt of a notice, or against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions:
2021/07/08
Committee: IMCO
Amendment 1152 #
Proposal for a regulation
Article 17 – paragraph 1 – point a
(a) decisions whether or not to remove or disable access to or restrict visibility of the information;
2021/07/08
Committee: IMCO
Amendment 1159 #
Proposal for a regulation
Article 17 – paragraph 1 – point b
(b) decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;
2021/07/08
Committee: IMCO
Amendment 1163 #
Proposal for a regulation
Article 17 – paragraph 1 – point c
(c) decisions whether or not to suspend or terminate the recipients’ account.
2021/07/08
Committee: IMCO
Amendment 1200 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 1
Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall always direct recipients to an out-of-court dispute settlement body. The information about the competent out-of-court body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner.
2021/07/08
Committee: IMCO
Amendment 1205 #
Proposal for a regulation
Article 18 – paragraph 1 – subparagraph 2
Paragraph 1 is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law.
2021/07/08
Committee: IMCO
Amendment 1208 #
Proposal for a regulation
Article 18 – paragraph 1 a (new)
1a. Online platforms shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body.
2021/07/08
Committee: IMCO
Amendment 1243 #
Proposal for a regulation
Article 18 – paragraph 2 a (new)
2a. Certified out-of-court dispute settlement bodies shall draw up annual reports listing the number of complaints received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes.
2021/07/08
Committee: IMCO
Amendment 1362 #
Proposal for a regulation
Article 21 – paragraph 2 a (new)
2a. When a platform that allows consumers to conclude distance contracts with traders becomes aware that a piece of information, a product or service poses a serious risk to the life, health or safety of consumers, it shall promptly inform the competent authorities of the Member State or Member States concerned and provide all relevant information available.
2021/07/08
Committee: IMCO
Amendment 1712 #
Proposal for a regulation
Article 30 – paragraph 1
1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy-to-access, functional and reliable tools through application programming interfaces, a repository containing the information referred to in paragraph 2, until five years after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multi-criterion queries can be performed per advertiser and per all data points present in the advertisement, and provide aggregated data for these queries on the amount spent, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed.
2021/07/08
Committee: IMCO
Amendment 1759 #
Proposal for a regulation
Article 31 – paragraph 3
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria, such as those reported in accordance with the obligations set out in Articles 13 and 23.
2021/07/08
Committee: IMCO
Amendment 1771 #
Proposal for a regulation
Article 31 – paragraph 5
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
2021/07/08
Committee: IMCO
Amendment 1804 #
Proposal for a regulation
Article 33 a (new)
Article 33a
Algorithm accountability
1. When using automated decision-making, the very large online platform shall perform an assessment of the algorithms used.
2. When carrying out the assessment referred to in paragraph 1, the very large online platform shall assess the following elements:
(a) the compliance with corresponding Union requirements;
(b) how the algorithm is used and its impact on the provision of the service;
(c) the impact on fundamental rights, including on consumer rights, as well as the social effect of the algorithms; and
(d) whether the measures implemented by the very large online platform to ensure the resilience of the algorithm are appropriate with regard to the importance of the algorithm for the provision of the service and its impact on the elements referred to in point (c).
3. When performing its assessment, the very large online platform may seek advice from relevant national public authorities, researchers and non-governmental organisations.
4. Following the assessment referred to in paragraph 2, the very large online platform shall communicate its findings to the Commission. The Commission shall be entitled to request additional explanation of the conclusions of the findings or, when the additional information on the findings provided is not sufficient, any relevant information on the algorithm in question in relation to points (a), (b), (c) and (d) of paragraph 2. The very large online platform shall communicate such additional information within a period of two weeks following the request of the Commission.
5. Where the very large online platform finds that the algorithm used does not comply with point (a) or (d) of paragraph 2 of this Article, the provider of the very large online platform shall take appropriate and adequate corrective measures to ensure that the algorithm complies with the criteria set out in paragraph 2.
6. Where the Commission finds, on the basis of the information provided by the very large online platform, that the algorithm used by the very large online platform does not comply with point (a), (c) or (d) of paragraph 2 of this Article, and that the very large online platform has not undertaken the corrective measures referred to in paragraph 5 of this Article, the Commission shall recommend appropriate measures laid down in this Regulation to stop the infringement.
2021/07/08
Committee: IMCO
Amendment 1842 #
Proposal for a regulation
Article 34 – paragraph 2 a (new)
2a. The absence of such standards as defined in this Article should not prevent the timely implementation of the measures outlined in this Regulation.
2021/07/08
Committee: IMCO