35 Amendments of Adam BIELAN related to 2022/0155(COD)
Amendment 167 #
Proposal for a regulation
Recital 5
(5) In order to achieve the objectives of this Regulation, it should cover providers of services that have the potential to be misused for the purpose of online child sexual abuse. As they are increasingly misused for that purpose, those services should include publicly available number-independent interpersonal communications services, such as messaging services and web-based e-mail services, in so far as those services are publicly available. The mere use of a number as an identifier should not be considered to be equivalent to the use of a number to connect with publicly assigned numbers and should therefore, in itself, not be considered to be sufficient to qualify a service as a number-based interpersonal communications service. To this end, obligations under this Regulation should apply to number-independent interpersonal communications services, regardless of whether they use numbers for the provision of their service, such as messaging services, in so far as those services are publicly available and they allow users of the service to upload, disseminate and exchange images, videos and sound not provided by the provider of the service itself. Services which enable direct interpersonal and interactive exchange of information merely as a minor ancillary feature that is intrinsically linked to another service, such as chat and similar functions as part of electronic games, image-sharing and video-hosting, should be covered by this Regulation in so far as they allow users of the service to upload, disseminate and exchange images and videos not provided by the provider of the service itself, or that depict content other than content related to gameplay. An additional assessment should be given to services enabling the transmission of sound as an ancillary feature, considering the general purpose of the service and its risk exposure, such as in the case of electronic games. Given the inherent differences between the various relevant information society services covered by this Regulation, the related varying risks that those services are misused for the purpose of online child sexual abuse and the varying ability of the providers concerned to prevent and combat such abuse, the obligations imposed on the providers of those services should be differentiated in an appropriate manner. For example, where it is necessary to involve providers of information society services, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the entity acting as data controller in accordance with Regulation (EU) 2016/679 or, where that is unfeasible, to the specific provider that has the technical and operational ability to act against specific child sexual abuse material, so as to prevent and minimise any possible negative effects on the availability and accessibility of information that is not illegal content.
To this end, detection obligations shall not apply to cloud computing services and web-hosting services when serving as infrastructure, given their specific role and the broad impact such obligations would have on users of cloud-hosted services.
Amendment 181 #
Proposal for a regulation
Recital 16
(16) In order to prevent and combat online child sexual abuse effectively, providers of hosting services and providers of publicly available number-independent interpersonal communications services should take reasonable measures to mitigate the risk of their services being misused for such abuse, as identified through the risk assessment. They should consider, in particular, the negative impacts of such measures on the fundamental rights, enshrined in the Charter, of all parties involved, and adopt appropriate and proportionate measures to protect children, for example by designing their online interfaces or parts thereof with the highest level of privacy, safety and security for children by default where appropriate, by adopting standards for the protection of children, or by participating in codes of conduct for protecting children. Providers subject to an obligation to adopt mitigation measures pursuant to Regulation (EU) 2022/2065 may consider to which extent mitigation measures adopted to comply with that obligation, which may include targeted measures to protect the rights of the child, including age assurance through parental control tools, may also serve to address the risk identified in the specific risk assessment pursuant to this Regulation, and to which extent further targeted mitigation measures may be required to comply with this Regulation.
Amendment 189 #
Proposal for a regulation
Recital 17 a (new)
(17 a) End-to-end encryption is an important tool for guaranteeing the security and confidentiality of users' communications, including those of children. Any weakening of encryption could potentially be abused by malicious third parties. In order to ensure effective consumer trust, nothing in this Regulation should be interpreted as requiring providers to prevent, circumvent, compromise or undermine the encryption they have in place, or as prohibiting providers of information society services from providing their services applying encryption, or as restricting or undermining such encryption in a manner detrimental to users' expectations of confidential and secure communication services, for example through the implementation of client-side scanning, other device-related or server-side solutions, or requirements to proactively forward electronic communications to third parties, which may weaken the encryption or introduce vulnerabilities into it. Member States should neither deter nor prevent providers of information society services from providing their services applying encryption, considering that such encryption is essential for trust in and security of digital services, and effectively prevents unauthorised third-party access.
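To illustrate the guarantee this recital seeks to preserve, the following minimal sketch (illustrative only, using the PyNaCl library; the relay step and variable names are hypothetical) shows that an intermediary relaying end-to-end encrypted messages cannot read their content, so any content inspection would have to take place on the user's device before encryption, i.e. the client-side scanning this recital cautions against.

```python
# Minimal sketch of end-to-end encryption with PyNaCl (libsodium bindings).
# Illustrative only: an intermediary relaying the ciphertext cannot read the
# message, so content scanning would have to occur on the sender's device
# *before* encryption (client-side scanning).
from nacl.public import PrivateKey, Box

# Each user generates a key pair; private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"confidential message")

# A relay server (hypothetical) sees only ciphertext; without a private key
# it cannot decrypt, which is the guarantee the recital protects.
relayed = ciphertext  # the server can store and forward, but not inspect

# Only Bob, holding his private key, can decrypt.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(relayed) == b"confidential message"
```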
Amendment 193 #
Proposal for a regulation
Recital 18
(18) In order to ensure that the objectives of this Regulation are achieved, that flexibility should be subject to the need to comply with Union law and, in particular, the requirements of this Regulation on mitigation measures. Therefore, providers of hosting services and providers of publicly available interpersonal communications services should, when designing and implementing the mitigation measures, give importance not only to ensuring their effectiveness, but also to avoiding any undue negative consequences for other affected parties, notably for the exercise of users' fundamental rights. In order to ensure proportionality, when determining which mitigation measures should reasonably be taken in a given situation, account should also be taken of the financial and technological capabilities and the size of the provider concerned. When selecting appropriate mitigation measures, providers should at least duly consider the possible measures listed in this Regulation, as well as, where appropriate, other measures such as those based on industry best practices, including as established through self-regulatory cooperation, and those contained in guidelines from the Commission. Those mitigation measures should always be the least intrusive option possible, with the level of intrusiveness increasing only where justified by the lack of effectiveness of the less intrusive option. When no risk has been detected after a diligently conducted or updated risk assessment, providers should not be required to take any mitigation measures.
Amendment 200 #
Proposal for a regulation
Recital 20 a (new)
(20 a) Having regard to the need to take due account of the fundamental rights guaranteed under the Charter of all parties concerned, any action taken by a provider of relevant information society services should be strictly targeted, in the sense that it should serve to detect, remove or disable access to the specific items of information considered to constitute child sexual abuse online, without unduly affecting the freedom of expression and of information of recipients of the service. Orders should therefore, as a general rule, be directed to the entity acting as data controller or, where that is unfeasible, to the specific provider of relevant information society services that has the technical and operational ability to act against such specific items of child sexual abuse material, so as to prevent and minimise any possible negative effects on the availability and accessibility of information that is not illegal content. Providers of relevant information society services that receive an order which they cannot execute for technical or operational reasons should inform the person or entity that submitted the order.
Amendment 232 #
Proposal for a regulation
Recital 70
(70) This Regulation recognises and reinforces the key role of hotlines in optimising the fight against child sexual abuse online at Union level. Hotlines are at the forefront of detecting new child sexual abuse material and have a proven track record of rapidly identifying and removing child sexual abuse material from the digital environment. Longstanding Union support for both INHOPE and its member hotlines recognises that hotlines are on the front line of the fight against online child sexual abuse. The EU Centre should leverage the network of hotlines and encourage them to work together effectively with the Coordinating Authorities, providers of relevant information society services and law enforcement authorities of the Member States. The hotlines' expertise and experience are an invaluable source of information on the early identification of common threats and solutions, as well as on regional and national differences across the Union.
Amendment 241 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point b
(b) obligations on relevant providers of information society services that allow for the exchange of images, videos and, where applicable, sound, to report online child sexual abuse;
Amendment 242 #
Proposal for a regulation
Article 1 – paragraph 1 – subparagraph 2 – point c
(c) obligations on providers of relevant information society services to remove or disable access to child sexual abuse material on their services;
Amendment 251 #
Proposal for a regulation
Article 2 – paragraph 1 – point a a (new)
(a a) ‘cloud computing service’ means a service as defined in Article 6, point (30), of Directive (EU) 2022/2555 of the European Parliament and of the Council;
Amendment 254 #
Proposal for a regulation
Article 2 – paragraph 1 – point b
(b) ‘interpersonal communications service’ means a publicly available service as defined in Article 2, point (5), of Directive (EU) 2018/1972, including services which enable direct interpersonal and interactive exchange of information that includes videos and images merely as a minor ancillary feature that is intrinsically linked to another service;
Amendment 275 #
Proposal for a regulation
Article 2 a (new)
Article 2 a
Voluntary own-initiative detection
Providers of relevant information society services shall be deemed eligible to carry out own-initiative investigations into, or take other measures aimed at, detecting, identifying, preventing the dissemination of, or removing, child sexual abuse on their services, in addition to the mandatory requirements provided for in this Regulation.
Amendment 276 #
Proposal for a regulation
Article 3 – paragraph 1
1. Providers of hosting services, excluding cloud computing services, and providers of number-independent interpersonal communications services shall identify, analyse and assess the recurrent systemic risk of use of their services for the purpose of online child sexual abuse. The risk assessment shall be specific to the service and proportionate to the systemic risk, considering its severity and probability, including on the basis of the specific cases in which the service was misused to disseminate child sexual abuse material.
Amendment 288 #
Proposal for a regulation
Article 3 – paragraph 2 – point b – indent 3
— functionalities enabling parental control that, among others, allow for age assurance;
Amendment 301 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 2
Amendment 303 #
Proposal for a regulation
Article 3 – paragraph 2 – point e – point iii – indent 3
— enabling users to establish direct contact and share images or videos with other users, in particular through private communications.
Amendment 321 #
Proposal for a regulation
Article 4 – paragraph 1 – introductory part
1. Providers of hosting services, excluding cloud computing services, and providers of interpersonal communications services shall take reasonable, proportionate and targeted mitigation measures, tailored to the risk identified pursuant to Article 3 and the type of service offered, to minimise that risk. Such measures shall include some or all of the following:
Amendment 327 #
Proposal for a regulation
Article 4 – paragraph 1 – point a a (new)
(a a) introducing parental control features and functionalities that allow parents or legal guardians to exercise oversight and control over the child's activity;
Amendment 330 #
Proposal for a regulation
Article 4 – paragraph 1 – point a b (new)
(a b) implementing measures to prevent and combat the dissemination of online child sexual abuse material;
Amendment 344 #
Proposal for a regulation
Article 4 – paragraph 2 – point b
(b) applied in line with the right to privacy and the safety of individuals, targeted and proportionate in relation to that risk, taking into account, in particular, the seriousness of the risk as well as the provider’s financial and technological capabilities and the number of users;
Amendment 352 #
Proposal for a regulation
Article 4 – paragraph 3
3. Providers of interpersonal communications services that have identified, pursuant to the risk assessment conducted or updated in accordance with Article 3, a risk of use of their services for the purpose of the solicitation of children, shall take targeted measures, such as parental control tools that enable age assurance, and other tools that adapt their online interface and protect child users from solicitation.
Amendment 386 #
Proposal for a regulation
Article 6 a (new)
Article 6 a
Encrypted services
Nothing in this Regulation shall be construed as prohibiting, restricting or undermining the provision or the use of encrypted services. Providers of information society services shall not be deterred or prevented by the relevant public authorities from offering encrypted services.
Amendment 394 #
Proposal for a regulation
Article 7 – paragraph 1
1. The Coordinating Authority of establishment shall have the power, as a last resort, when all the measures in Articles 3, 4 and 5 have been exhausted, to request the competent judicial authority of the Member State that designated it, or another independent administrative authority of that Member State, to issue, for a limited time and for the sole purpose of detecting known child sexual abuse material, a detection order requiring a provider of hosting services, excluding cloud computing services, or a provider of number-independent interpersonal communications services under the jurisdiction of that Member State to take the measures specified in Article 10 to detect known child sexual abuse on a specific service or relating to specific users or groups of users. The detection order shall be limited to the information identified in the order and shall allow the service provider to fulfil it without carrying out an independent assessment of that content, so that search and removal can be carried out by reliable automated tools. The detection order shall be directed to the providers of hosting services, excluding cloud computing services, and of number-independent interpersonal communications services that can reasonably be expected to have the technical and operational ability to act.
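For illustration, the following minimal sketch (an assumption, not part of the proposal) shows the principle behind detecting known child sexual abuse material with reliable automated tools: uploaded content is matched against a vetted list of hashes, so the provider need not assess the content itself. Deployed systems typically rely on perceptual hashes (for example PhotoDNA) that survive re-encoding; plain SHA-256 is used here only to keep the example self-contained, and the hash list and function names are hypothetical.

```python
# Minimal sketch of automated detection of *known* material by hash matching.
# Real systems typically use perceptual hashes that tolerate re-encoding;
# SHA-256 here only demonstrates the exact-match principle of comparing
# uploads against a vetted hash list identified in advance in the order.
import hashlib

# Hypothetical list of digests of verified known material, as would be
# provided by a trusted authority under the Regulation's scheme.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_material(content: bytes) -> bool:
    """Return True if the content's digest appears in the known-hash list."""
    return hashlib.sha256(content).hexdigest() in known_hashes

# The provider can act on a match without assessing the content itself.
print(is_known_material(b"test"))  # True: SHA-256(b"test") is in the set above
```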
Amendment 524 #
Proposal for a regulation
Article 10 – paragraph 4 – point e a (new)
(e a) ensure privacy and safety by design and by default and, where applicable, the protection of encryption;
Amendment 542 #
Proposal for a regulation
Article 12 – paragraph 3
3. The provider shall establish and operate an accessible, effective, age- appropriate and user-friendly mechanism that allows users to flag to the provider potential online child sexual abuse on the service.
Amendment 565 #
Proposal for a regulation
Article 15 – paragraph 1 a (new)
Amendment 578 #
Proposal for a regulation
Article 21 – paragraph 1
1. Providers of hosting services and, where applicable, cloud computing services shall provide reasonable assistance, on request, to persons residing in the Union who seek to have one or more specific items of known child sexual abuse material depicting them removed, or to have access thereto disabled, by the provider.
Amendment 579 #
Proposal for a regulation
Article 21 – paragraph 2 – subparagraph 1
Persons residing in the Union shall have the right to receive, upon their request, from the Coordinating Authority designated by the Member State where the person resides, support from the EU Centre when they seek to have a provider of hosting services and, where applicable, cloud computing services remove or disable access to one or more specific items of known child sexual abuse material depicting them. Persons with disabilities shall have the right to ask for and receive any information relating to such support in a manner accessible to them.
Amendment 592 #
Proposal for a regulation
Article 26 – paragraph 2 – point a
Amendment 595 #
Proposal for a regulation
Article 26 – paragraph 2 – point e
Amendment 597 #
Proposal for a regulation
Article 26 – paragraph 3
3. Paragraph 2 shall not prevent supervision of the Coordinating Authorities in accordance with national constitutional law, to the extent that such supervision does not affect their independence as required under this Regulation, nor coordination with public authorities relevant to combating child sexual abuse material.
Amendment 629 #
Proposal for a regulation
Article 39 – paragraph 2
2. The EU Centre shall establish and maintain one or more reliable and secure information-sharing systems supporting communications between Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services.
Amendment 632 #
Proposal for a regulation
Article 39 – paragraph 3
3. The Coordinating Authorities, hotlines, the Commission, the EU Centre, other relevant Union agencies and providers of relevant information society services shall use the information-sharing systems referred to in paragraph 2 for all relevant communications pursuant to this Regulation.
Amendment 635 #
Proposal for a regulation
Article 39 – paragraph 3 a (new)
3 a. Where the EU Centre receives a report from a hotline, or where a provider that submitted a report to the EU Centre has indicated that the report is based on information received from a hotline, the EU Centre shall coordinate with the relevant Coordinating Authorities in order to avoid duplicate reporting of material that has already been reported to national law enforcement authorities by the hotlines, and shall monitor the removal of the child sexual abuse material or cooperate with the relevant hotline to track its status.
Amendment 645 #
Proposal for a regulation
Article 83 – paragraph 1 – point a – indent 3
— in relation to complaints and cases submitted by users in connection with the measures taken to comply with the order, the number of complaints submitted directly to the provider, the number of cases brought before a judicial authority, the basis for those complaints and cases, the decisions taken in respect of those complaints and in those cases, the median time needed for taking those decisions and the number of instances where those decisions were subsequently reversed;
Amendment 647 #
Proposal for a regulation
Article 83 – paragraph 1 – point b
(b) the number of removal orders issued to the provider in accordance with Article 14 and the median time needed for removing or disabling access to the item or items of child sexual abuse material in question;
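To illustrate why Amendments 645 and 647 replace the average with the median, the following short sketch (with hypothetical removal times) shows that a single extreme outlier inflates the mean while leaving the median, and thus the picture of typical handling time, largely unchanged.

```python
# Worked example (hypothetical removal times, in hours): a single slow
# outlier inflates the mean but barely moves the median, so the median
# better reflects how quickly a provider typically acts.
from statistics import mean, median

removal_times_hours = [2, 3, 3, 4, 5, 240]  # one order took ten days

print(mean(removal_times_hours))    # 42.83... hours, dominated by the outlier
print(median(removal_times_hours))  # 3.5 hours, the typical case
```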