19 Amendments of Luigi MORGANO related to 2016/0151(COD)

Amendment 55 #
Proposal for a directive
Recital 8
(8) In order to ensure coherence and give certainty to businesses and Member States' authorities, the notion of "incitement to hatred" should, to the appropriate extent, be aligned to the definition in the Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law which defines hate speech as "publicly inciting to violence or hatred". This should include aligning the grounds on which incitement to violence or hatred is based. Provisions concerning hate speech should be applied in line with the jurisprudence of the European Court of Human Rights concerning the right to freedom of expression and information.
2016/12/06
Committee: LIBE
Amendment 59 #
Proposal for a directive
Recital 9
(9) In order to empower viewers, including in particular parents and minors, in making informed decisions about the content to be watched, it is necessary that audiovisual media service providers provide sufficient information about content that may impair minors' physical, mental or moral development. This could be done, for instance, through a system of content descriptors indicating the nature of the content. Content descriptors could be delivered through written, graphical or acoustic means.
2016/12/06
Committee: LIBE
Amendment 61 #
Proposal for a directive
Recital 9 a (new)
(9a) The consumption of linear TV channels in Europe remains prominent in comparison with other media services, as demonstrated by the study "Linear and on-demand audiovisual media services in Europe 2015", published by the European Audiovisual Observatory in June 2016, which shows that linear TV in Europe remains in overall good condition, with the number of available linear TV channels in 2015 having increased by 46% on average compared to 2009;
2016/12/06
Committee: LIBE
Amendment 62 #
Proposal for a directive
Recital 9 b (new)
(9b) There has been a slight decrease in Europe in the viewing of linear TV channels by young people, as demonstrated by the study "Measurement of Fragmented Audiovisual Audiences", published by the European Audiovisual Observatory in November 2015, which shows in 2014 an average decrease in the European Union of only 4% among young people aged 12-34, compared to 2011;
2016/12/06
Committee: LIBE
Amendment 63 #
Proposal for a directive
Recital 9 c (new)
(9c) A ban on the television broadcasting, by broadcasters under their jurisdiction, of programmes which might seriously impair the physical, mental or moral development of minors, as well as technical measures in the case of on-demand services, have been adopted by all Member States, as demonstrated by the study "Analysis of the implementation of the provisions contained in the AVMSD concerning the protection of minors", published by the European Audiovisual Observatory in November 2015;
2016/12/06
Committee: LIBE
Amendment 98 #
Draft legislative resolution
Citation 5 a (new)
- having regard to the study "Linear and on-demand audiovisual media services in Europe 2015", published by the European Audiovisual Observatory in June 2016,
2016/10/27
Committee: CULT
Amendment 99 #
Draft legislative resolution
Citation 5 b (new)
- having regard to the study "Analysis of the implementation of the provisions contained in the AVMSD concerning the protection of minors", published by the European Audiovisual Observatory in November 2015,
2016/10/27
Committee: CULT
Amendment 99 #
Proposal for a directive
Recital 31 a (new)
(31a) The 2011 EU Agenda for the Rights of the Child defines "the Treaties, the Charter of Fundamental Rights of the European Union and the UN Convention on the Rights of the Child (UNCRC) as a common basis for all EU action which is relevant to children". Articles 5 and 19 of the UNCRC are of particular relevance for the protection of children in audiovisual media services.
2016/12/06
Committee: LIBE
Amendment 111 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted: 'Article 1a 1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate, effective and efficient measures to: (a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of individuals or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin; (b) protect all citizens from programmes and user-generated videos containing incitement to commit terrorist acts and justifying terrorism; (c) protect all citizens from programmes and user-generated videos containing intentional and continuous persecution directed against an individual or a group of individuals; (d) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. Such content shall, in any case, only be made available in such a way as to ensure that minors will not normally hear or see it. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. 2. 
What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in the light of the nature of the content in question, shall be proportionate to the potential harm it may cause, the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content. 3. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2. Such complaint and redress mechanisms shall ensure the effective and permanent removal of the content referred to in paragraph 1. 4. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 30. Where co-regulation or self-regulation proves to be ineffective, national regulatory bodies shall be entitled to exercise their effective powers.'
2016/12/06
Committee: LIBE
Amendment 112 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted: 'Article 1a 1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate measures to: (a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin; (b) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. Such content shall in any case only be made available in such a way as to ensure that minors will not normally hear or see it. 2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in the light of the nature of the content in question, shall be proportionate to the potential harm it may cause, the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content. 3. 
For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article -2f(3) and (4). Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraph 2 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 29. When adopting such measures the Member States shall respect the conditions set by applicable Union law, in particular Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU. 4. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.'
2016/12/06
Committee: LIBE
Amendment 113 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 b (new)
Directive 2010/13/EU
Chapter II – Article 1 b (new)
(2b) The following article is inserted: 'Article 1b 1. Member States shall ensure that audiovisual commercial communications provided by media service providers and video-sharing platform providers under their jurisdiction comply with the following requirements: (a) audiovisual commercial communications shall be readily recognisable as such. Surreptitious audiovisual commercial communication shall be prohibited; (b) audiovisual commercial communications shall not use subliminal techniques; (c) audiovisual commercial communications shall not: i. prejudice respect for human dignity; ii. encourage behaviour prejudicial to health or safety; iii. gratuitously offend or insult religious groups or members thereof with respect to their religious affiliation, or their religious convictions or symbols; iv. encourage behaviour grossly prejudicial to the protection of the environment; v. contain sexualisation of children or degrading depictions of women; (d) all forms of audiovisual commercial communications for cigarettes and other tobacco products shall be prohibited; (e) audiovisual commercial communications for alcoholic beverages shall not be aimed specifically at minors and shall not encourage immoderate consumption of such beverages; (f) audiovisual commercial communication for medicinal products and medical treatment available only on prescription in the Member State within whose jurisdiction the media service provider falls shall be prohibited; (g) audiovisual commercial communications shall not cause physical or moral detriment to minors. Therefore they shall not directly exhort minors to buy or hire a product or service by exploiting their inexperience or credulity, directly encourage them to persuade their parents or others to purchase the goods or services being advertised, exploit the special trust minors place in parents, teachers or other persons, or unreasonably show minors in dangerous situations. 
(h) pornography, including representations liable to incite hatred based on sex, shall be prohibited in all forms of audiovisual commercial communications; 2. Member States and the Commission shall encourage the development of self- and co-regulatory codes of conduct regarding inappropriate audiovisual commercial communications.'
2016/12/06
Committee: LIBE
Amendment 114 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 c (new)
Directive 2010/13/EU
Chapter II – Article 1 c (new)
(2c) The following article is inserted: 'Article 1c 1. Member States shall, by appropriate means, ensure, within the framework of their legislation, that media service providers and video-sharing platform providers under their jurisdiction effectively comply with the provisions of this Directive. 2. Member States shall remain free to require media service providers and video-sharing platform providers under their jurisdiction to comply with more detailed or stricter rules with regard to Articles -2 to -2e, Article 7, Article 13, Article 16, Article 17, Articles 19 to 26, Articles 30 and 30a, provided that such rules are in compliance with Union law and respect communicative freedoms. 3. Member States shall encourage co- and self-regulation through codes of conduct adopted at national level in the fields coordinated by this Directive to the extent permitted by their legal systems. Those codes shall be broadly accepted by stakeholders in the Member States concerned, in particular parents' associations active in the protection of minors. Such associations shall be involved in the drafting of these codes. The codes of conduct shall clearly and unambiguously set out their objectives. They shall provide for regular, transparent and independent monitoring and evaluation of the achievement of the objectives aimed at, with the full involvement of the above-mentioned associations. They shall provide for effective enforcement, including, when appropriate, effective and proportionate sanctions. 4. The Commission and ERGA shall encourage media service providers and video-sharing platform providers to exchange best practices on co-regulatory systems across the Union. 5. In co-operation with the Member States, the Commission shall facilitate the development of Union codes of conduct in consultation with media service providers and video-sharing platform providers where appropriate. 
Draft Union codes of conduct and amendments or extensions to existing Union codes of conduct shall be submitted to the Commission by the signatories of these codes. The contact committee established pursuant to Article 29 shall decide on the drafts, amendments or extensions of those codes. The Commission shall publish those codes. 6. If a national independent regulatory body concludes that any code of conduct, or parts of it, has proven to be insufficiently effective, the Member State of that regulatory body remains free to require media service providers and video-sharing platform providers under their jurisdiction to comply with more detailed or stricter rules in compliance with Union law and in respect of communicative freedoms. Such legislation shall be reported to the Commission without delay. 7. Directive 2000/31/EC shall apply unless otherwise provided for in this Directive. In the event of a conflict between a provision of Directive 2000/31/EC and a provision of this Directive, the provisions of this Directive shall prevail, unless otherwise provided for in this Directive.'
2016/12/06
Committee: LIBE
Amendment 127 #
Proposal for a directive
Recital 8
(8) In order to ensure coherence and give certainty to businesses and Member States' authorities, the notion of "incitement to hatred" should, to the appropriate extent, be aligned to the definition in the Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law which defines hate speech as "publicly inciting to violence or hatred". This should include aligning the grounds on which incitement to violence or hatred is based. Provisions concerning hate speech should be applied in line with the jurisprudence of the European Court of Human Rights concerning the right to freedom of expression and information.
2016/10/27
Committee: CULT
Amendment 137 #
Proposal for a directive
Recital 9
(9) In order to empower viewers, including in particular parents and minors, in making informed decisions about the content to be watched, it is necessary that audiovisual media service providers provide sufficient information about content that may impair minors' physical, mental or moral development. This could be done, for instance, through a system of content descriptors indicating the nature of the content. Content descriptors could be delivered through written, graphical or acoustic means.
2016/10/27
Committee: CULT
Amendment 305 #
Proposal for a directive
Recital 31 a (new)
(31a) The 2011 EU Agenda for the Rights of the Child defines "the Treaties, the Charter of Fundamental Rights of the European Union and the UN Convention on the Rights of the Child (UNCRC) as a common basis for all EU action which is relevant to children". Articles 5 and 19 of the UNCRC are of particular relevance for the protection of children in audiovisual media services.
2016/10/27
Committee: CULT
Amendment 365 #
Proposal for a directive
Article 1 – paragraph 1 – point 1 – point b
Directive 2010/13/EU
Article 1 – paragraph 1 – point a a – point i
(i) the service consists of the storage of a large amount of programmes or user-generated videos, over the selection of which the video-sharing platform provider does not exercise effective control;
2016/10/27
Committee: CULT
Amendment 410 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article –2 (new)
(2a) The following article is inserted: 'Article -2 1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate measures to: (a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin; (b) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. Such content shall in any case only be made available in such a way as to ensure that minors will not normally hear or see it. 2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in the light of the nature of the content in question, shall be proportionate to the potential harm it may cause, the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content. 3. 
For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article -2f(3) and (4). Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraph 2 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 29. When adopting such measures the Member States shall respect the conditions set by applicable Union law, in particular Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU. 4. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.'
2016/10/27
Committee: CULT
Amendment 417 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 b (new)
Directive 2010/13/EU
Chapter II – Article –2 a (new)
(2b) The following article is inserted: 'Article -2a 1. Member States shall ensure that audiovisual commercial communications provided by media service providers and video-sharing platform providers under their jurisdiction comply with the following requirements: (a) audiovisual commercial communications shall be readily recognisable as such. Surreptitious audiovisual commercial communication shall be prohibited; (b) audiovisual commercial communications shall not use subliminal techniques; (c) audiovisual commercial communications shall not: (i) prejudice respect for human dignity; (ii) encourage behaviour prejudicial to health or safety; (iii) gratuitously offend or insult religious groups or members thereof with respect to their religious affiliation, or their religious convictions or symbols; (iv) encourage behaviour grossly prejudicial to the protection of the environment; (v) contain sexualisation of children or degrading depictions of women; (d) all forms of audiovisual commercial communications for cigarettes and other tobacco products shall be prohibited; (e) audiovisual commercial communications for alcoholic beverages shall not be aimed specifically at minors and shall not encourage immoderate consumption of such beverages; (f) audiovisual commercial communication for medicinal products and medical treatment available only on prescription in the Member State within whose jurisdiction the media service provider falls shall be prohibited; (g) audiovisual commercial communications shall not cause physical or moral detriment to minors. Therefore they shall not directly exhort minors to buy or hire a product or service by exploiting their inexperience or credulity, directly encourage them to persuade their parents or others to purchase the goods or services being advertised, exploit the special trust minors place in parents, teachers or other persons, or unreasonably show minors in dangerous situations. 
(h) pornography, including representations liable to incite hatred based on sex, shall be prohibited in all forms of audiovisual commercial communications; 2. Member States and the Commission shall encourage the development of self- and co-regulatory codes of conduct regarding inappropriate audiovisual commercial communications.'
2016/10/27
Committee: CULT
Amendment 429 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 g (new)
Directive 2010/13/EU
Chapter II – Article – 2 f (new)
(2g) The following article is inserted: 'Article -2f 1. Member States shall, by appropriate means, ensure, within the framework of their legislation, that media service providers and video-sharing platform providers under their jurisdiction effectively comply with the provisions of this Directive. 2. Member States shall remain free to require media service providers and video-sharing platform providers under their jurisdiction to comply with more detailed or stricter rules with regard to Articles -2 to -2e, Article 7, Article 13, Article 16, Article 17, Articles 19 to 26, Articles 30 and 30a, provided that such rules are in compliance with Union law and respect communicative freedoms. 3. Member States shall encourage co- and self-regulation through codes of conduct adopted at national level in the fields coordinated by this Directive to the extent permitted by their legal systems. Those codes shall be broadly accepted by stakeholders in the Member States concerned, in particular parents' associations active in the protection of minors. Such associations shall be involved in the drafting of these codes. The codes of conduct shall clearly and unambiguously set out their objectives. They shall provide for regular, transparent and independent monitoring and evaluation of the achievement of the objectives aimed at, with the full involvement of the above-mentioned associations. They shall provide for effective enforcement, including, when appropriate, effective and proportionate sanctions. 4. The Commission and ERGA shall encourage media service providers and video-sharing platform providers to exchange best practices on co-regulatory systems across the Union. 5. In co-operation with the Member States, the Commission shall facilitate the development of Union codes of conduct in consultation with media service providers and video-sharing platform providers where appropriate. 
Draft Union codes of conduct and amendments or extensions to existing Union codes of conduct shall be submitted to the Commission by the signatories of these codes. The contact committee established pursuant to Article 29 shall decide on the drafts, amendments or extensions of those codes. The Commission shall publish those codes. 6. If a national independent regulatory body concludes that any code of conduct, or parts of it, has proven to be insufficiently effective, the Member State of that regulatory body remains free to require media service providers and video-sharing platform providers under their jurisdiction to comply with more detailed or stricter rules in compliance with Union law and in respect of communicative freedoms. Such legislation shall be reported to the Commission without delay. 7. Directive 2000/31/EC shall apply unless otherwise provided for in this Directive. In the event of a conflict between a provision of Directive 2000/31/EC and a provision of this Directive, the provisions of this Directive shall prevail, unless otherwise provided for in this Directive.'
2016/10/27
Committee: CULT