
31 Amendments of Silvia COSTA related to 2016/0151(COD)

Amendment 61 #
Proposal for a directive
Recital 9 a (new)
(9a) The consumption of linear TV channels in Europe remains prominent in comparison to other media services, as demonstrated by the study "Linear and on-demand audiovisual media services in Europe 2015", published by the European Audiovisual Observatory in June 2016, which shows that linear TV in Europe remains in overall good condition, with the number of available linear TV channels in 2015 having increased on average by 46% compared with 2009;
2016/12/06
Committee: LIBE
Amendment 62 #
Proposal for a directive
Recital 9 b (new)
(9b) There has been only a slight decrease in the consumption of linear TV channels by young people in Europe, as demonstrated by the study "Measurement of Fragmented Audiovisual Audiences", published by the European Audiovisual Observatory in November 2015, which shows that in 2014 viewing by young people aged 12-34 in the European Union had decreased on average by only 4% compared with 2011;
2016/12/06
Committee: LIBE
Amendment 63 #
Proposal for a directive
Recital 9 c (new)
(9c) All Member States have adopted a ban on the inclusion in television broadcasts, by broadcasters under their jurisdiction, of programmes which might seriously impair the physical, mental or moral development of minors, as well as technical measures in the case of on-demand services, as demonstrated by the study "Analysis of the implementation of the provisions contained in the AVMSD concerning the protection of minors", published by the European Audiovisual Observatory in November 2015;
2016/12/06
Committee: LIBE
Amendment 98 #
Draft legislative resolution
Citation 5 a (new)
- having regard to the study "Linear and on-demand audiovisual media services in Europe 2015", published by the European Audiovisual Observatory in June 2016,
2016/10/27
Committee: CULT
Amendment 99 #
Draft legislative resolution
Citation 5 b (new)
- having regard to the study "Analysis of the implementation of the provisions contained in the AVMSD concerning the protection of minors", published by the European Audiovisual Observatory in November 2015,
2016/10/27
Committee: CULT
Amendment 111 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted:
'Article 1a
1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate, effective and efficient measures to:
(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of individuals or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;
(b) protect all citizens from programmes and user-generated videos containing incitement to commit terrorist acts and justifying terrorism;
(c) protect all citizens from programmes and user-generated videos containing intentional and continuous persecution directed against an individual or a group of individuals;
(d) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. Such content shall, in any case, only be made available in such a way as to ensure that minors will not normally hear or see it. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default.
2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question and shall be proportionate to the potential harm it may cause, taking into account the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.
3. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2. Such complaint and redress mechanisms shall ensure the effective and permanent removal of the content referred to in paragraph 1.
4. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 30. Where co-regulation or self-regulation proves to be ineffective, national regulatory bodies are entitled to exercise their effective powers.'
2016/12/06
Committee: LIBE
Amendment 112 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article 1 a (new)
(2a) The following article is inserted:
'Article 1a
1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate measures to:
(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;
(b) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. Such content shall in any case only be made available in such a way as to ensure that minors will not normally hear or see it.
2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question and shall be proportionate to the potential harm it may cause, taking into account the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.
3. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article -2f(3) and (4). Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraph 2 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 29. When adopting such measures, the Member States shall respect the conditions set by applicable Union law, in particular Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU.
4. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.'
2016/12/06
Committee: LIBE
Amendment 132 #
Proposal for a directive
Recital 8 a (new)
(8a) In order to access information, evaluate media contexts, and use, critically assess and create media content responsibly, citizens need to possess advanced media literacy skills. It is therefore necessary to promote the development of media literacy in all sections of society, for citizens of all ages and for all media, and to follow its progress closely.
2016/10/27
Committee: CULT
Amendment 169 #
Proposal for a directive
Recital 13
(13) The market for TV broadcasting has evolved, and there is a need for more flexibility with regard to audiovisual commercial communications, in particular for quantitative rules for linear audiovisual media services, product placement and sponsorship. The emergence of new services, including those without advertising, has led to a greater choice for viewers, who can easily switch to alternative offers.
deleted
2016/10/27
Committee: CULT
Amendment 209 #
Proposal for a directive
Recital 19
(19) While this Directive does not increase the overall amount of admissible advertising time during the period from 7:00 to 23:00, it is important for broadcasters to have more flexibility and to be able to decide when to place advertising in order to maximise advertisers' demand and viewers' flow. The hourly limit should thus be abolished while a daily limit of 20% of advertising within the period from 7:00 to 23:00 should be introduced.
deleted
2016/10/27
Committee: CULT
Amendment 270 #
Proposal for a directive
Recital 28
(28) Some of the content stored on video-sharing platforms is not under the editorial responsibility of the video-sharing platform provider. However, those providers typically determine the organisation of the content, namely programmes or user-generated videos, including by automatic means or algorithms. Therefore, those providers should be required to take appropriate measures to protect minors from content that may impair their physical, mental or moral development and protect all citizens from incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, race, colour, religion, descent or national or ethnic origin, as well as to protect all citizens from content justifying terrorism and containing incitement to intentional and continuous persecution directed against an individual or a group of individuals.
2016/10/27
Committee: CULT
Amendment 289 #
Proposal for a directive
Recital 30
(30) It is appropriate to involve the video-sharing platform providers as much as possible when implementing the appropriate measures to be taken pursuant to this Directive. Co-regulation should therefore be encouraged. With a view to ensuring a clear and consistent approach in this regard across the Union, Member States should not be entitled to require video-sharing platform providers to take stricter measures to protect minors from harmful content and all citizens from content containing incitement to violence or hatred than the ones provided for in this Directive. It should remain possible for Member States to take such stricter measures where that content is illegal, provided that they comply with Articles 14 and 15 of Directive 2000/31/EC, and to take measures with respect to content on websites containing or disseminating child pornography, as required by and allowed under Article 25 of Directive 2011/93/EU of the European Parliament and of the Council 35. It should also remain possible for video-sharing platform providers to take stricter measures on a voluntary basis.
__________________
35 Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA (OJ L 335, 17.12.2011, p. 1).
2016/10/27
Committee: CULT
Amendment 365 #
Proposal for a directive
Article 1 – paragraph 1 – point 1 – point b
Directive 2010/13/EU
Article 1 – paragraph 1 – point a a – point i
(i) the service consists of the storage of a large amount of programmes or user-generated videos, over the selection of which the video-sharing platform provider does not exercise effective control;
2016/10/27
Committee: CULT
Amendment 408 #
Proposal for a directive
Article 1 – paragraph 1 – point 2
Directive 2010/13/EU
Chapter II - Title
(2) the title of Chapter II is replaced by the following:
'GENERAL PROVISIONS FOR AUDIOVISUAL MEDIA SERVICES'
deleted
2016/10/27
Committee: CULT
Amendment 409 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article –2 (new)
(2a) The following article is inserted:
'Article -2
1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate, effective and efficient measures to:
(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of individuals or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;
(b) protect all citizens from programmes and user-generated videos containing incitement to commit terrorist acts and justifying terrorism;
(c) protect all citizens from programmes and user-generated videos containing intentional and continuous persecution directed against an individual or a group of individuals;
(d) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. Such content shall, in any case, only be made available in such a way as to ensure that minors will not normally hear or see it. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default.
2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question and shall be proportionate to the potential harm it may cause, taking into account the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.
3. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2. Such complaint and redress mechanisms shall ensure the effective and permanent removal of the content referred to in paragraph 1.
4. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 30. Where co-regulation or self-regulation proves to be ineffective, national regulatory bodies are entitled to exercise their effective powers.'
2016/10/27
Committee: CULT
Amendment 410 #
Proposal for a directive
Article 1 – paragraph 1 – point 2 a (new)
Directive 2010/13/EU
Chapter II – Article –2 (new)
(2a) The following article is inserted:
'Article -2
1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that media service providers and video-sharing platform providers under their jurisdiction take appropriate measures to:
(a) protect all citizens from programmes and user-generated videos containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, racial or ethnic origin, religion or belief, disability, age or sexual orientation, descent or national origin;
(b) protect minors from programmes or user-generated videos which may impair their physical, mental or moral development. The most harmful content, such as gratuitous violence or pornography, shall not be included in television broadcasts by broadcasters and, in the case of on-demand media services, shall be subject to the strictest measures, such as encryption and effective parental controls. Such measures shall include selecting the time of their availability, age verification tools or other technical measures, including parental control tools by default. Such content shall in any case only be made available in such a way as to ensure that minors will not normally hear or see it.
2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question and shall be proportionate to the potential harm it may cause, taking into account the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the providers and of the users having created and/or uploaded the content, as well as the public interest and respect for communicative freedoms. Providers shall provide sufficient information to viewers about such content, preferably using a system of descriptors indicating the nature of the content.
3. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article -2f(3) and (4). Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraph 2 of this Article. Member States shall entrust this task to the bodies designated in accordance with Article 29. When adopting such measures, the Member States shall respect the conditions set by applicable Union law, in particular Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU.
4. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between recipients of a service and media service providers or video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.'
2016/10/27
Committee: CULT
Amendment 571 #
Proposal for a directive
Article 1 – paragraph 1 – point 8 – introductory part
Directive 2010/13/EU
Article 6
(8) Article 6 is replaced by the following:
deleted
2016/10/27
Committee: CULT
Amendment 581 #
Proposal for a directive
Article 1 – paragraph 1 – point 9
Directive 2010/13/EU
Article 6 a
(9) the following Article 6a is inserted:
'Article 6a
1. Member States shall ensure that audiovisual media service providers provide sufficient information to viewers about content which may impair the physical, mental or moral development of minors. For this purpose, Member States may use a system of descriptors indicating the nature of the content of an audiovisual media service.
2. For the implementation of this Article, Member States shall encourage co-regulation.
3. The Commission and ERGA shall encourage media service providers to exchange best practices on co-regulatory systems across the Union. Where appropriate, the Commission shall facilitate the development of Union codes of conduct.'
deleted
2016/10/27
Committee: CULT
Amendment 705 #
Proposal for a directive
Article 1 – paragraph 1 – point 14 – introductory part
Directive 2010/13/EU
Article 12
(14) Article 12 is replaced by the following and moved to Chapter III:
deleted
2016/10/27
Committee: CULT
Amendment 777 #
Proposal for a directive
Article 1 – paragraph 1 – point 16
Directive 2010/13/EU
Article 20 – paragraph 2
(16) In Article 20, paragraph 2, the first sentence is replaced by the following: 'The transmission of films made for television (excluding series, serials and documentaries), cinematographic works and news programmes may be interrupted by television advertising and/or teleshopping once for each scheduled period of at least 20 minutes.'
deleted
2016/10/27
Committee: CULT
Amendment 795 #
Proposal for a directive
Article 1 – paragraph 1 – point 17
Directive 2010/13/EU
Article 23
(17) Article 23 is replaced by the following:
'Article 23
1. The daily proportion of television advertising spots and teleshopping spots within the period between 7:00 and 23:00 shall not exceed 20 %.
2. Paragraph 1 shall not apply to:
(a) announcements made by the broadcaster in connection with its own programmes and ancillary products directly derived from those programmes or with programmes from other entities belonging to the same media group;
(b) sponsorship announcements;
(c) product placements;'
deleted
2016/10/27
Committee: CULT
Amendment 839 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a
'Article 28a
1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that video-sharing platform providers take appropriate measures to:
(a) protect minors from content which may impair their physical, mental or moral development;
(b) protect all citizens from content containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, race, colour, religion, descent or national or ethnic origin.
2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question, the harm it may cause, the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the video-sharing platform providers and the users having created and/or uploaded the content as well as the public interest.
Those measures shall consist of, as appropriate:
(a) defining and applying in the terms and conditions of the video-sharing platform providers the concepts of incitement to violence or hatred as referred to in point (b) of paragraph 1 and of content which may impair the physical, mental or moral development of minors, in accordance with Articles 6 and 12 respectively;
(b) establishing and operating mechanisms for users of video-sharing platforms to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 stored on its platform;
(c) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
(d) establishing and operating systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
(e) providing for parental control systems with respect to content which may impair the physical, mental or moral development of minors;
(f) establishing and operating systems through which providers of video-sharing platforms explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (b).
3. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article 4(7).
4. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 taken by video-sharing platform providers. Member States shall entrust this task to the authorities designated in accordance with Article 30.
5. Member States shall not impose on video-sharing platform providers measures that are stricter than the measures referred to in paragraphs 1 and 2. Member States shall not be precluded from imposing stricter measures with respect to illegal content. When adopting such measures, they shall respect the conditions set by applicable Union law, such as, where appropriate, those set in Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU.
6. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between users and video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.
7. The Commission and ERGA shall encourage video-sharing platform providers to exchange best practices on co-regulatory systems across the Union. Where appropriate, the Commission shall facilitate the development of Union codes of conduct.
8. Video-sharing platform providers or, where applicable, the organisations representing those providers in this respect shall submit to the Commission draft Union codes of conduct and amendments to existing Union codes of conduct. The Commission may request ERGA to give an opinion on the drafts, amendments or extensions of those codes of conduct. The Commission may give appropriate publicity to those codes of conduct.'
deleted
2016/10/27
Committee: CULT
Amendment 840 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 1
1. Without prejudice to Articles 14 and 15 of Directive 2000/31/EC, Member States shall ensure that video-sharing platform providers take appropriate measures to:
(a) protect minors from content which may impair their physical, mental or moral development;
(b) protect all citizens from content containing incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to sex, race, colour, religion, descent or national or ethnic origin.
deleted
2016/10/27
Committee: CULT
Amendment 870 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 2
2. What constitutes an appropriate measure for the purposes of paragraph 1 shall be determined in light of the nature of the content in question, the harm it may cause, the characteristics of the category of persons to be protected as well as the rights and legitimate interests at stake, including those of the video-sharing platform providers and the users having created and/or uploaded the content as well as the public interest.
Those measures shall consist of, as appropriate:
(a) defining and applying in the terms and conditions of the video-sharing platform providers the concepts of incitement to violence or hatred as referred to in point (b) of paragraph 1 and of content which may impair the physical, mental or moral development of minors, in accordance with Articles 6 and 12 respectively;
(b) establishing and operating mechanisms for users of video-sharing platforms to report or flag to the video-sharing platform provider concerned the content referred to in paragraph 1 stored on its platform;
(c) establishing and operating age verification systems for users of video-sharing platforms with respect to content which may impair the physical, mental or moral development of minors;
(d) establishing and operating systems allowing users of video-sharing platforms to rate the content referred to in paragraph 1;
(e) providing for parental control systems with respect to content which may impair the physical, mental or moral development of minors;
(f) establishing and operating systems through which providers of video-sharing platforms explain to users of video-sharing platforms what effect has been given to the reporting and flagging referred to in point (b).
deleted
2016/10/27
Committee: CULT
Amendment 908 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 3
3. For the purposes of the implementation of the measures referred to in paragraphs 1 and 2, Member States shall encourage co-regulation as provided for in Article 4(7).
deleted
2016/10/27
Committee: CULT
Amendment 915 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 4
4. Member States shall establish the necessary mechanisms to assess the appropriateness of the measures referred to in paragraphs 2 and 3 taken by video-sharing platform providers. Member States shall entrust this task to the authorities designated in accordance with Article 30.
deleted
2016/10/27
Committee: CULT
Amendment 925 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 5
5. Member States shall not impose on video-sharing platform providers measures that are stricter than the measures referred to in paragraphs 1 and 2. Member States shall not be precluded from imposing stricter measures with respect to illegal content. When adopting such measures, they shall respect the conditions set by applicable Union law, such as, where appropriate, those set in Articles 14 and 15 of Directive 2000/31/EC or Article 25 of Directive 2011/93/EU.
deleted
2016/10/27
Committee: CULT
Amendment 934 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 6
6. Member States shall ensure that complaint and redress mechanisms are available for the settlement of disputes between users and video-sharing platform providers relating to the application of the appropriate measures referred to in paragraphs 1 and 2.
deleted
2016/10/27
Committee: CULT
Amendment 938 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 7
7. The Commission and ERGA shall encourage video-sharing platform providers to exchange best practices on co-regulatory systems across the Union. Where appropriate, the Commission shall facilitate the development of Union codes of conduct.
deleted
2016/10/27
Committee: CULT
Amendment 944 #
Proposal for a directive
Article 1 – paragraph 1 – point 19
Directive 2010/13/EU
Article 28 a – paragraph 8
8. Video-sharing platform providers or, where applicable, the organisations representing those providers in this respect shall submit to the Commission draft Union codes of conduct and amendments to existing Union codes of conduct. The Commission may request ERGA to give an opinion on the drafts, amendments or extensions of those codes of conduct. The Commission may give appropriate publicity to those codes of conduct.
deleted
2016/10/27
Committee: CULT
Amendment 1042 #
Proposal for a directive
Article 1 – paragraph 1 – point 23
Directive 2010/13/EU
Article 33 – paragraph 2
By [date – no later than four years after adoption] at the latest, and every three years thereafter, the Commission shall submit to the European Parliament, to the Council and to the European Economic and Social Committee a report on the application of this Directive in all Member States, as well as a report on practices, policies and actions supported by Member States in the field of media literacy.
2016/10/27
Committee: CULT