
12 Amendments of Cécile Kashetu KYENGE related to 2016/0151(COD)

Amendment 111 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted:

'Article 1a

1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate, effective and efficient measures to:

(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of individuals or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;

(b) protect all citizens from programmes and user-generated videos containing incitement to commit terrorist acts and justifying terrorism;

(c) protect all citizens from programmes and user-generated videos containing intentional and continuous persecution directed against an individual or a group of individuals;

(d) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. Such content shall, in any case, only be made available in such a way as to ensure that minors will not normally hear or see it. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default.

2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question and shall be proportionate to the potential harm it may cause, the characteristics of the category of persons to be protected and the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.

3. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2. Such complaint and redress mechanisms shall ensure the effective and permanent removal of the content referred to in paragraph 1.

4. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 30. Where co-regulation or self-regulation proves to be ineffective, national regulatory bodies are entitled to exercise their effective powers.'
2016/12/06
Committee: LIBE
Amendment 126 #
Proposal for a directive
Recital 8
(8) In order to ensure coherence and give legal certainty to businesses and Member States' authorities, the notion of "incitement to hatred" should, to the appropriate extent, be aligned with the definition in the Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law, which defines hate speech as "publicly inciting to violence or hatred". This should include aligning the grounds on which incitement to violence or hatred is based, as well as those grounds not covered by Council Framework Decision 2008/913/JHA, such as social origin, genetic features, language, political or any other opinion, membership of a national minority, property, birth, disability, age, gender, gender expression, gender identity, sexual orientation, residence status or health.
2016/10/27
Committee: CULT
Amendment 138 #
Proposal for a directive
Recital 9
(9) In order to empower viewers, including, in particular, parents and minors, in making informed decisions about the content to be watched, it is necessary that audiovisual media service providers provide sufficient information about content that may impair minors' physical or mental development. This could be done, for instance, through a system of content descriptors indicating the nature of the content. Content descriptors could be delivered through written, graphical or acoustic means.
2016/10/27
Committee: CULT
Amendment 268 #
Proposal for a directive
Recital 28
(28) An important share of the content hosted on video-sharing platforms is not under the editorial responsibility of the video-sharing platform provider. However, those providers typically determine the organisation of the content, namely programmes or user-generated videos, including by automatic means or algorithms. Therefore, those providers should be required to take appropriate measures to protect minors from content that may impair their physical or mental development and protect all users from incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender, gender expression, gender identity, sexual orientation, residence status or health.
2016/10/27
Committee: CULT
Amendment 556 #
Proposal for a directive
Article 1 – paragraph 1 – point 5 – point d
Directive 2010/13/EU
Article 4 – paragraph 7 – subparagraph 1
The European Commission shall encourage and facilitate co-regulation and self-regulation through codes of conduct adopted at national level in the fields coordinated by this Directive to the extent permitted by national legal systems. Those codes shall be such that they are broadly accepted by the main stakeholders in the Member States concerned. The codes of conduct shall clearly and unambiguously set out their objectives. They shall provide for regular, transparent and independent monitoring and evaluation of the achievement of the objectives aimed at. They shall provide for effective and transparent enforcement, including, when appropriate, the application of effective and proportionate sanctions.
2016/10/27
Committee: CULT
Amendment 587 #
Proposal for a directive
Article 1 – paragraph 1 – point 9
Directive 2010/13/EU
Article 6 a - paragraph 1
1. Member States shall ensure that audiovisual media service providers provide sufficient information to viewers about content which may impair the physical or mental development of minors. For this purpose, Member States may use a system of descriptors indicating the nature of the content of an audiovisual media service.
2016/10/27
Committee: CULT
Amendment 589 #
Proposal for a directive
Article 1 – paragraph 1 – point 9
Directive 2010/13/EU
Article 6 a – paragraph 1
1. Member States shall ensure that audiovisual media service providers provide sufficient information to viewers about content which may impair the physical or mental development of minors. For this purpose, Member States may use a system of descriptors indicating the nature of the content of an audiovisual media service.
2016/10/27
Committee: CULT
Amendment 711 #
Proposal for a directive
Article 1 – paragraph 1 – point 14
Directive 2010/13/EU
Article 12 – subparagraph 1
Member States shall take appropriate measures to ensure that programmes provided by audiovisual media service providers under their jurisdiction, which may impair the physical or mental development of minors, are only made available in such a way as to ensure that minors will not normally hear or see them. Such measures may include selecting the time of the broadcast, age verification tools or other technical measures. They shall be proportionate to the potential harm of the programme.
2016/10/27
Committee: CULT
Amendment 851 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 1 – point a
(a) protect minors from content which may impair their physical or mental development;
2016/10/27
Committee: CULT
Amendment 857 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 1 – point b
(b) protect all citizens from content containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age, gender, gender expression, gender identity, sexual orientation, residence status or health.
2016/10/27
Committee: CULT
Amendment 888 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 2 – subparagraph 2 – point a
(a) specifying the characteristics of and applying in the terms and conditions of the video-sharing platform providers the concepts of incitement to violence or hatred as referred to in point (b) of paragraph 1 and of content which may impair the physical or mental development of minors, in accordance with Articles 6 and 12 respectively;
2016/10/27
Committee: CULT
Amendment 990 #
Proposal for a directive
Article 1 – paragraph 1 – point 21
Directive 2010/13/EU
Article 30 – paragraph 2 – subparagraph 1
Member States shall ensure that national regulatory authorities exercise their powers impartially and transparently and in accordance with the objectives of this Directive, in particular media pluralism, non-discrimination, cultural diversity, consumer protection, internal market and the promotion of fair competition.
2016/10/27
Committee: CULT