
9 Amendments of Caterina CHINNICI related to 2016/0151(COD)

Amendment 61 #
Proposal for a directive
Recital 9 a (new)
(9a) The consumption of linear TV channels in Europe remains prominent in comparison with other media services, as demonstrated by the study "Linear and on-demand audiovisual media services in Europe 2015", published by the European Audiovisual Observatory in June 2016, which shows that linear TV in Europe remains in overall good condition, with the number of available linear TV channels increasing on average by 46% between 2009 and 2015;
2016/12/06
Committee: LIBE
Amendment 62 #
Proposal for a directive
Recital 9 b (new)
(9b) There has been only a slight decrease in the consumption of linear TV channels by young people in Europe, as demonstrated by the study "Measurement of Fragmented Audiovisual Audiences", published by the European Audiovisual Observatory in November 2015, which shows an average decrease in the European Union of only 4% among young people aged 12-34 in 2014, compared with 2011;
2016/12/06
Committee: LIBE
Amendment 63 #
Proposal for a directive
Recital 9 c (new)
(9c) All Member States have adopted a ban on the broadcasting, by television broadcasters under their jurisdiction, of programmes which might seriously impair the physical, mental or moral development of minors, as well as technical measures in the case of on-demand services, as demonstrated by the study "Analysis of the implementation of the provisions contained in the AVMSD concerning the protection of minors", published by the European Audiovisual Observatory in November 2015;
2016/12/06
Committee: LIBE
Amendment 98 #
Draft legislative resolution
Citation 5 a (new)
- having regard to the study on "Linear and on-demand audiovisual media services in Europe 2015", published by the European Audiovisual Observatory in June 2016,
2016/10/27
Committee: CULT
Amendment 99 #
Draft legislative resolution
Citation 5 b (new)
- having regard to the study "Analysis of the implementation of the provisions contained in the AVMSD concerning the protection of minors", published by the European Audiovisual Observatory in November 2015,
2016/10/27
Committee: CULT
Amendment 111 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted:

'Article 1a

1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate, effective and efficient measures to:

(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of individuals or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;

(b) protect all citizens from programmes and user-generated videos containing incitement to commit terrorist acts and justifying terrorism;

(c) protect all citizens from programmes and user-generated videos containing intentional and continuous persecution directed against an individual or a group of individuals;

(d) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. Such content shall, in any case, only be made available in such a way as to ensure that minors will not normally hear or see it. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default.

2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question, shall be proportionate to the potential harm it may cause and to the characteristics of the category of persons to be protected, and shall take account of the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.

3. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2. Such complaint and redress mechanisms shall ensure the effective and permanent removal of the content referred to in paragraph 1.

4. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 30. Where co-regulation or self-regulation proves to be ineffective, national regulatory bodies shall be entitled to exercise their effective powers.'
2016/12/06
Committee: LIBE
Amendment 112 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted:

'Article 1a

1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate measures to:

(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;

(b) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. Such content shall in any case only be made available in such a way as to ensure that minors will not normally hear or see it.

2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question, shall be proportionate to the potential harm it may cause and to the characteristics of the category of persons to be protected, and shall take account of the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.

3. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article -2f(3) and (4). Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraph 2 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 29. When adopting such measures, Member States shall respect the conditions set by applicable Union law, in particular Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU.

4. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.'
2016/12/06
Committee: LIBE
Amendment 410 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article –2 (new)
(2a) The following article is inserted:

'Article -2

1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate measures to:

(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;

(b) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. Such content shall in any case only be made available in such a way as to ensure that minors will not normally hear or see it.

2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question, shall be proportionate to the potential harm it may cause and to the characteristics of the category of persons to be protected, and shall take account of the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.

3. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article -2f(3) and (4). Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraph 2 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 29. When adopting such measures, Member States shall respect the conditions set by applicable Union law, in particular Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU.

4. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.'
2016/10/27
Committee: CULT
Amendment 416 #
Proposal for a directive
Article 1 – paragraph 1 – point 1 – point 2
Directive 2010/13/EU
Chapter II – Article –2 a (new)
(2) The following article is inserted:

‘Article -2a

1. Member States shall ensure that audiovisual commercial communications provided by media service providers and video-sharing platform providers under their jurisdiction comply with the following requirements:

(a) audiovisual commercial communications shall be readily recognisable as such. Surreptitious audiovisual commercial communication shall be prohibited;

(b) audiovisual commercial communications shall not use subliminal techniques and, in particular, shall not expose minors to aggressive, misleading or intrusive advertising;

(c) audiovisual commercial communications shall not:
(i) prejudice respect for human dignity;
(ii) encourage behaviour prejudicial to health or safety, in particular for children as regards foods and beverages that are high in salt, sugars or fat or that otherwise do not fit national or international nutritional guidelines;
(iii) encourage behaviour grossly prejudicial to the protection of the environment;

(d) all forms of audiovisual commercial communications for cigarettes and other tobacco products shall be prohibited;

(e) audiovisual commercial communications for alcoholic beverages shall not be aimed specifically at minors and shall not encourage immoderate consumption of such beverages;

(f) audiovisual commercial communications for medicinal products and medical treatment available only on prescription in the Member State within whose jurisdiction the media service provider falls shall be prohibited;

(g) audiovisual commercial communications shall not cause physical or moral detriment to minors. They shall therefore not directly exhort minors to buy or hire a product or service by exploiting their inexperience or credulity, directly encourage them to persuade their parents or others to purchase the goods or services being advertised, exploit the special trust minors place in parents, teachers or other persons, or unreasonably show minors in dangerous situations.

2. Member States and the Commission shall encourage the development of self- and co-regulatory codes of conduct regarding inappropriate audiovisual commercial communications and shall facilitate the exchange of best practices across the Union.’
2016/10/27
Committee: CULT