Progress: Procedure completed
Role | Committee | Rapporteur | Shadows |
---|---|---|---|
Lead | CULT | COSTA Silvia ( S&D) | ZANICCHI Iva ( PPE), SCHAAKE Marietje ( ALDE), BENARAB-ATTOU Malika ( Verts/ALE) |
Committee Opinion | IMCO | ||
Committee Opinion | LIBE | HEDH Anna ( S&D) | |
Committee Opinion | ITRE |
Lead committee dossier:
Legal Basis:
RoP 54
Legal Basis:
RoP 54Subjects
Events
The European Parliament adopted a resolution on protecting children in the digital world.
Parliament notes the fact that almost 15 % of internet users who are minors aged between 10 and 17 receive some form of sexual solicitation, and 34 % of them encounter sexual material that they have not searched for. Minors must be protected from the dangers of the digital world in accordance with their age and developmental progress.
It considers, in this context, the measures taken by Member States to prevent illegal online content are not always effective and inevitably involve differing approaches to the prevention of content that is harmful to children.
According to Parliament, the protection of minors in the digital world must be addressed at regulatory level by deploying more effective measures, including through self-regulation by engaging the industry to assume its shared responsibility, and at educational and training level by training children, parents and teachers in order to prevent minors from accessing illegal content. This is why they propose a strategy that seeks to strike the right balance between free access to the internet and combating illegal content.
A framework of rights and governance: Parliament points out that a new stage of protecting the rights of the child in the EU framework started with the entry into force of the Treaty of Lisbon, together with the now legally binding Charter of Fundamental Rights of the European Union, whose Article 24 defines the protection of children as a fundamental right. It reiterates the need for the EU to fully respect the standards of the relevant international instruments and urge the Member States to transpose and implement, in a smooth and timely manner, all legal instruments in the area of the protection of minors in the digital world.
Parliament welcomes the Commission’s European strategy for a better internet for children and calls on the Commission to enhance existing internal mechanisms to ensure a consistent and coordinated approach to child safety online, underlining that only a comprehensive combination of legal, technical and educational measures , including prevention, can adequately address the dangers that children face online , and enhance the protection of children in the online environment.
In this context, several measures are recommended:
the continuation of the Safer Internet Programme , with adequate funding to carry out its activities; research and education programmes aimed at reducing the risk of children becoming victims of the internet; close collaboration with civil society associations and organisations working inter alia for the protection of minors, data protection and education.
Media and new media: access and education: pointing out that the internet provides children and young people with immensely valuable tools, which can be used to express or assert their views, access information and learning and claim their rights, as well as being an excellent tool of communication, Parliament also highlights the inherent risks for the most vulnerable users: child pornography, the exchange of material on violence, cybercrime, intimidation, bullying, grooming, children being able to access or acquire legally restricted or age-inappropriate goods and services, exposure to age-inappropriate, aggressive or misleading advertising, scams, etc.
It underlines that the new information and communication options offered by the digital world, such as computers, TV on different platforms, mobile phones, video games, tablets, apps, and the level of diffusion of different media that converge in a single digital system, entail not only a host of possibilities and opportunities for children and adolescents, but also risks in terms of easy access to content that is illegal, unsuitable or harmful to the development of minors, as well as the possibility that data may be collected with the aim of targeting children as consumers, with harmful, unmeasured effects.
Parliament supports Member States’ efforts to promote systematic education and training for children (from an early age), parents, educators, schoolteachers and social workers, aimed at enabling them to understand the digital world and identify the associated dangers. To this end, it encourages ongoing digital training for educators who work with students in schools on a permanent basis.
It also highlights the role of parents and of the family and urges the Commission to support awareness-raising initiatives aimed at parents and educators.
It also points to the role of the private sector and industry as regards their responsibility in relation to these issues as well as child-safe labelling for web pages, and promotion of ‘netiquette’ for children. In this context, it urges the Commission to include in its main priorities the protection of children from aggressive or misleading TV and online advertising . Special attention must be given to online marketing of harmful substances, such as alcohol, given that social networks facilitate the online marketing of this product.
Right to protection: Parliament outlines its vision of protecting children from the dangers of the internet. It focuses on the following measures:
1) Combating illegal content: in this regard, Parliament calls for:
the collection, in the framework of its reporting obligation on the transposition of Directive 2011/92/EU , of exact and clear data on the crime of online grooming ; further improvement could be achieved in connection with further harmonisation of the criminal law and criminal procedures of the Member States, including eventual proposals for material EU criminal legislation that fully respect the principles of subsidiarity and proportionality; the strengthened cooperation with third countries as regards the prompt deletion of web pages containing or disseminating illegal content, as well as the combat of cybercrime; the introduction and strengthening of hotline systems for reporting crimes and illegal content and conduct, respecting the rights of suspects and the improved information for children and families regarding national hotlines and other contact points such as “safety buttons”; strengthened international cooperation between law enforcement agencies and the development of synergies with other related services, including police and juvenile justice systems; the dissemination of reliable instruments, such as warning pages or acoustic and optical signals to limit direct access of minors to content that is harmful to them; a stronger commitment from digital content and service suppliers to implement codes of conduct compliant with the regulations that are in force, to identify, prevent and remove illegal content based on the decisions of the legal authorities; the launch of a campaign addressed at parents to assist them in understanding the digital material that is being managed by their children; the proper implementation by Member States of the existing procedural rules for deleting websites hosting exploitative, threatening, abusive, discriminatory or otherwise malicious content; the consideration of possible legislative measures if industry self-regulation fails to deliver.
Parliament also regrets the failure to comply with the pact signed on 9 February 2009 between the Commission and 17 social networking sites, including Facebook and Myspace, which promoted the protection and security of minors online.
2) Combating harmful content: Parliament considers it urgent for the Commission to examine the effectiveness of the various systems for voluntary classification of content unsuitable for minors in the Member States and calls on it, as well as the Member States and the internet industry, to reinforce cooperation in the development of strategies and standards to train minors in the responsible use of the internet .
The following measures are recommended:
the integration of the protection of minors into the respective by-laws associations of audiovisual and digital service suppliers the harmonisation by the Member States of the classification of digital content for minors (e.g. games by age-group), in cooperation with the relevant operators and associations, and with third countries; the establishment of the ‘European Framework for Safer Mobile Use’ by exploiting the options that facilitate parental control.
3) Protection of privacy: although Parliament welcomes the new proposal for a Regulation on personal data protection and its special provisions on children’s consent and the right to be forgotten, which bans the preservation online of information on the personal data of minors, which may pose a risk to their personal and professional life, it calls for further clarification. It considers that owners and administrators of web pages should indicate in a clear and visible way their data protection policy and should provide for a system of mandatory parental consent for the processing of data of children under the age of 13 . It favours ensuring that users have more information on how their personal data (and that of associated parties) are handled and consider that this information should be made available in a language and form adapted to the user profiles.
It calls for the promotion in every digital sector of technological options which, if selected, can limit the websurfing of minors within traceable limits and with conditional access, thereby providing an effective tool for parental control.
4) Right of reply in digital media: Parliament calls for the development and harmonisation of systems relating to the right of reply in digital media.
Right to digital citizenship: given the impact of digital technology as an important learning tool for citizenship, Parliament calls on the Member States to consider digital platforms as training tools for democratic participation for every child. Measures would have to be taken to take into account the most vulnerable. Lastly, it recalls that information and citizenship are closely linked on the internet and that what threatens the civic engagement of young people today is the lack of interest they show in information.
The Committee on Culture and Education adopted the own-initiative report by Silvia COSTA (S&D, IT) on protecting children in the digital world.
Members note the fact that almost 15 % of internet users who are minors aged between 10 and 17 receive some form of sexual solicitation, and 34 % of them encounter sexual material that they have not searched for. They consider, in this context, the measures taken by Member States to prevent illegal online content are not always effective and inevitably involve differing approaches to the prevention of content that is harmful to children.
According to Members, the protection of minors in the digital world must be addressed at regulatory level by deploying more effective measures, including through self-regulation by engaging the industry to assume its shared responsibility, and at educational and training level by training children, parents and teachers in order to prevent minors from accessing illegal content. This is why they propose a strategy that seeks to strike the right balance between free access to the internet and combating illegal content.
A framework of rights and governance: Members point out that a new stage of protecting the rights of the child in the EU framework started with the entry into force of the Treaty of Lisbon, together with the now legally binding Charter of Fundamental Rights of the European Union, whose Article 24 defines the protection of children as a fundamental right. They reiterate the need for the EU to fully respect the standards of the relevant international instruments and urge the Member States to transpose and implement, in a smooth and timely manner, all legal instruments in the area of the protection of minors in the digital world.
Members welcome the Commission ’ s European strategy for a better internet for children and call on the Commission to enhance existing internal mechanisms to ensure a consistent and coordinated approach to child safety online, underlining that only a comprehensive combination of legal, technical and educational measures , including prevention, can adequately address the dangers that children face online , and enhance the protection of children in the online environment.
In this context, several measures are recommended:
the continuation of the Safer Internet Programme , with adequate funding to carry out its activities; research and education programmes aimed at reducing the risk of children becoming victims of the internet; close collaboration with civil society associations and organisations working inter alia for the protection of minors, data protection and education.
Media and new media: access and education: pointing out that the internet provides children and young people with immensely valuable tools, which can be used to express or assert their views, access information and learning and claim their rights, as well as being an excellent tool of communication, Members also highlight the inherent risks for the most vulnerable users: child pornography, the exchange of material on violence, cybercrime, intimidation, bullying, grooming, children being able to access or acquire legally restricted or age-inappropriate goods and services, exposure to age-inappropriate, aggressive or misleading advertising, scams, etc.
They, therefore, support Member States ’ efforts to promote systematic education and training for children (from an early age), parents, educators, schoolteachers and social workers, aimed at enabling them to understand the digital world and identify the associated dangers. To this end, they encourage ongoing digital training for educators who work with students in schools on a permanent basis.
Members also highlight the role of parents and of the family and urge the Commission to support awareness-raising initiatives aimed at parents and educators in order to ensure that they can best support minors in the use of digital tools and services.
They highlight, in particular, the role of the private sector and industry as regards their responsibility in relation to these issues as well as child-safe labelling for web pages, and promotion of ‘ netiquette ’ for children. In this context, they urge the Commission to include in its main priorities the protection of children from aggressive or misleading TV and online advertising .
Right to protection: Members outline their vision of protecting children from the dangers of the internet. The report focuses on the following measures:
1) Combating illegal content: in this regard, Members call for:
the collection, in the framework of its reporting obligation on the transposition of Directive 2011/92/EU , of exact and clear data on the crime of online grooming ; further improvement could be achieved in connection with further harmonisation of the criminal law and criminal procedures of the Member States, including eventual proposals for material EU criminal legislation that fully respect the principles of subsidiarity and proportionality; the strengthened cooperation with third countries as regards the prompt deletion of web pages containing or disseminating illegal content; the introduction and strengthening of hotline systems for reporting crimes and illegal content and conduct, respecting the rights of suspects and the improved information for children and families regarding national hotlines and other contact points such as “safety buttons”; strengthened international cooperation between law enforcement agencies and the development of synergies with other related services, including police and juvenile justice systems; the dissemination of reliable instruments, such as warning pages or acoustic and optical signals to limit direct access of minors to content that is harmful to them; a stronger commitment from digital content and service suppliers to implement codes of conduct compliant with the regulations that are in force, to identify, prevent and remove illegal content based on the decisions of the legal authorities; the proper implementation by Member States of the existing procedural rules for deleting websites hosting exploitative, threatening, abusive, discriminatory or otherwise malicious content.
Members also regret the failure to comply with the pact signed on 9 February 2009 between the Commission and 17 social networking sites, including Facebook and Myspace, which promoted the protection and security of minors online.
2) Combating harmful content: Members consider it urgent for the Commission to examine the effectiveness of the various systems for voluntary classification of content unsuitable for minors in the Member States and call on it, as well as the Member States and the internet industry, to reinforce cooperation in the development of strategies and standards to train minors in the responsible use of the internet .
The following measures are recommended:
the integration of the protection of minors into the respective by-laws associations of audiovisual and digital service suppliers the harmonisation by the Member States of the classification of digital content for minors (e.g. games by age-group), in cooperation with the relevant operators and associations, and with third countries; the establishment of the ‘ European Framework for Safer Mobile Use ’ by exploiting the options that facilitate parental control.
3) Protection of privacy:
Although Members welcome the new proposal for a Regulation on personal data protection and its special provisions on children ’ s consent and the right to be forgotten, which bans the preservation online of information on the personal data of minors, which may pose a risk to their personal and professional life, they call for further clarification. They consider that owners and administrators of web pages should indicate in a clear and visible way their data protection policy and should provide for a system of mandatory parental consent for the processing of data of children under the age of 13 . Members favour ensuring that users have more information on how their personal data (and that of associated parties) are handled and consider that this information should be made available in a language and form adapted to the user profiles.
They call for t he promotion in every digital sector of technological options which, if selected, can limit the websurfing of minors within traceable limits and with conditional access, thereby providing an effective tool for parental control.
4) Right of reply in digital media: Members call for the development and harmonisation of systems relating to the right of reply in digital media.
Right to digital citizenship: given the impact of digital technology as an important learning tool for citizenship, Members call on the Member States to consider digital platforms as training tools for democratic participation for every child. Measures would have to be taken to take into account the most vulnerable. They recall that information and citizenship are closely linked on the internet and that what threatens the civic engagement of young people today is the lack of interest they show in information. Lastly, it should be noted that in accordance with Rule 52(3) of Parliament’s Rules of Procedure, a minority opinion was tabled in the context of this report by several members of the ALDE group.
The Members in question reject the report’s focus on government campaigns and the extension of enforcement to ISPs and other self-regulating mechanisms which, in their view, diminishes the role of parents in their children’s education. They consider that the measures included in the report furthermore demonstrate an unwarranted bias towards the perceived dangers of the Internet, limiting the opportunities for education and innovation.
The minority opinion recommends rather that youth’s resilience and independence should be strengthened and that efforts should be directed at educating children and youth and developing their e-skills.
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors.
CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe.
The report discusses Member States’ reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content.
However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse – and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas:
Tackling illegal and harmful content : while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly.
Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency.
Making social networks safer places : whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users.
Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation.
Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner.
Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring.
In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis.
Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread.
Streamlining content rating schemes : 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field.
Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation – the conceptions of what is necessary and useful diverge significantly between and within Member States.
While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature
e of online content, ways to better align such systems should be explored further.
Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States.
Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of “appropriateness” and reflecting the established approaches to the liability of the various Internet actors.
Danger of market fragmentation : the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the “do's” and “don't” to protect and empower children who go online.
This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world.
The Commission presents report on the application of Council Recommendation 98/560/EC concerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows:
Tackling illegal or harmful content : content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments / codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines
While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency.
Hotlines : the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on.
Internet Service Providers (ISPs) : ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC). This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world.
Social networking sites : given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default” settings for children joining in social networking sites is not widespread.
Problematic Internet content from other Member States / from outside the EU : enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries.
Access restrictions to content : this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation – the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States.
Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of “appropriateness” and reflecting the established approaches to the liability of the various Internet actors.
Audiovisual Media Services : as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services.
Conclusions : the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse – and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes.
Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers.
Documents
- Commission response to text adopted in plenary: SP(2013)110
- Results of vote in Parliament: Results of vote in Parliament
- Decision by Parliament: T7-0428/2012
- Debate in Parliament: Debate in Parliament
- Committee report tabled for plenary: A7-0353/2012
- Committee opinion: PE491.241
- Amendments tabled in committee: PE489.363
- Committee draft report: PE486.198
- Contribution: COM(2011)0556
- Follow-up document: COM(2011)0556
- Follow-up document: EUR-Lex
- Non-legislative basic document published: COM(2011)0556
- Non-legislative basic document published: EUR-Lex
- Follow-up document: COM(2011)0556 EUR-Lex
- Committee draft report: PE486.198
- Amendments tabled in committee: PE489.363
- Committee opinion: PE491.241
- Commission response to text adopted in plenary: SP(2013)110
- Contribution: COM(2011)0556
Activities
- Alexander Nuno PICKART ALVARO
Plenary Speeches (2)
- Silvia COSTA
Plenary Speeches (2)
- Amelia ANDERSDOTTER
Plenary Speeches (1)
- Roberta ANGELILLI
Plenary Speeches (1)
- Erik BÁNKI
Plenary Speeches (1)
- Andrea ČEŠKOVÁ
Plenary Speeches (1)
- Sergio Gaetano COFFERATI
Plenary Speeches (1)
- Anna Maria CORAZZA BILDT
Plenary Speeches (1)
- Pat the Cope GALLAGHER
Plenary Speeches (1)
- Louis GRECH
Plenary Speeches (1)
- Petru Constantin LUHAN
Plenary Speeches (1)
- Iosif MATULA
Plenary Speeches (1)
- Marek Henryk MIGALSKI
Plenary Speeches (1)
- Andreas MÖLZER
Plenary Speeches (1)
- Franz OBERMAYR
Plenary Speeches (1)
- Jaroslav PAŠKA
Plenary Speeches (1)
- Oreste ROSSI
Plenary Speeches (1)
- Marie-Thérèse SANCHEZ-SCHMID
Plenary Speeches (1)
- Daciana Octavia SÂRBU
Plenary Speeches (1)
- Olga SEHNALOVÁ
Plenary Speeches (1)
- Joanna Katarzyna SKRZYDLEWSKA
Plenary Speeches (1)
- Claudiu Ciprian TĂNĂSESCU
Plenary Speeches (1)
- Silvia-Adriana ȚICĂU
Plenary Speeches (1)
- Jarosław WAŁĘSA
Plenary Speeches (1)
- Josef WEIDENHOLZER
Plenary Speeches (1)
- Angelika WERTHMANN
Plenary Speeches (1)
Amendments | Dossier |
182 |
2012/2068(INI)
2012/05/10
CULT
182 amendments...
Amendment 1 #
Motion for a resolution Citation 2 bis (new) - having regard to the European Convention on Human Rights and Convention 108 of the Council of Europe,
Amendment 10 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed at
Amendment 100 #
Motion for a resolution Paragraph 6 6. Underlines the need for an educational alliance among families, school, civil society, interested parties, media and audiovisual services, in order to guarantee a balanced dynamic between the digital world and minors;
Amendment 101 #
Motion for a resolution Paragraph 6 6. Underlines the need for an educational alliance among families, school, civil society, interested parties and audiovisual services, in order to guarantee a balanced and proactive dynamic between the digital world and minors;
Amendment 102 #
Motion for a resolution Paragraph 6 bis (new) 6a. Encourages Member States to support schools in developing the ability among children to test and produce quality online and offline products and services;
Amendment 103 #
Motion for a resolution Paragraph 6 bis (new) 6a. Calls on Member States, public authorities and access providers to intensify their communication campaigns in order to make minors, adolescents, parents and educators aware of uncontrolled digital dangers;
Amendment 104 #
Motion for a resolution Paragraph 7 Amendment 105 #
Motion for a resolution Paragraph 7 7. Encourages the Commission to support the access of minors to safe and high quality digital content in existing and new programmes and services in the digitial world, dedicated to young people
Amendment 106 #
Motion for a resolution Paragraph 7 7. Encourages the Commission to support the access of minors to
Amendment 107 #
Motion for a resolution Paragraph 7 7. Encourages the Commission to support the access of minors to safe and high quality pluralistic digital content in existing programmes, dedicated to young people, education, culture, the arts and the digital world;
Amendment 108 #
Motion for a resolution Paragraph 7 7. Encourages the Commission and member states to support the access of minors to safe and high quality digital content in existing programmes, dedicated to young people, education and the digital world;
Amendment 109 #
Motion for a resolution Paragraph 7 7. Encourages the Commission to support the equal access of minors to safe and high quality digital content in existing programmes, dedicated to young people, education and the digital world;
Amendment 11 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed not only using measures to encourage self- regulation but also at both regulatory level, by deploying more effective instruments of prevention and repression, and on the educational level; whereas education and guidance play an extremely important role in this regard;
Amendment 110 #
Motion for a resolution Paragraph 8 Amendment 111 #
Motion for a resolution Paragraph 8 Amendment 112 #
Motion for a resolution Paragraph 8 8.
Amendment 113 #
Motion for a resolution Paragraph 8 8. Recommends that the Commission propose a review of the Audiovisual Media Services Directive
Amendment 114 #
Motion for a resolution Paragraph 8 8. Recommends that the Commission propose a review of the Audiovisual Media Services Directive, including the production of pluralistic and quality online and offline services for young people, as well as recommending to the Member States to include these objectives among the obligations of the public service in particular;
Amendment 115 #
Motion for a resolution Paragraph 8 8. Recommends that the Commission propose a review of the Audiovisual Media Services Directive, including the production of secure and quality online and offline services for young people, as well as recommending to the Member States to include these objectives among the obligations of the public service;
Amendment 116 #
Motion for a resolution Paragraph 8 a (new) 8 a. Urges the Commission to include in its main priorities the protection of children from agressive or misleading TV and online advertising; calls on the industry to respect and fully implement existing codes of conduct and similar initiatives;
Amendment 117 #
Motion for a resolution Paragraph 9 9. Highlights the effectiveness of formal, informal, non-formal and peer education in the diffusion of safe practices and the potential threats (through concrete examples) among minors in using the Internet, social networks, video games and mobile telephones, and encourages ‘European Schoolnet’ to facilitate mentoring among students in this field; stresses the need of informing also parents about safe practices and threats;
Amendment 118 #
Motion for a resolution Paragraph -10 a (new) -10 a. Welcomes the intention of the European Commission to consider possible legislative measures if industry self-regulation fails to deliver;
Amendment 119 #
Motion for a resolution Paragraph 9 a (new) 9 a. Calls on the Commission and member states to develop schemes aimed at equipping children and young people with adequate skills and securing informed access to the internet and new media for them. In this regard, highlights the importance of making digital media literacy mainstreamed at all levels of formal and non-formal education, including a lifelong learning approach from the earliest stage possible;
Amendment 12 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed at both regulatory level, by deploying more effective instruments of prevention and repression, and on the educational level by training children, parents and teachers in order to prevent minors from accessing illegal content;
Amendment 120 #
Motion for a resolution Paragraph 10 10.
Amendment 121 #
Motion for a resolution Paragraph 10 10.
Amendment 122 #
Motion for a resolution Paragraph 10 10. Deplores the slowness in the ‘notice and take down’ procedure in some Member States, questions the reason for this, and welcomes the Commission’s initiative in publishing an impact assessment in this regard;
Amendment 123 #
Motion for a resolution Paragraph 10 10. Deplores the slowness in the ‘notice and take down’ procedure in some Member States
Amendment 124 #
Motion for a resolution Paragraph 11 Amendment 125 #
Motion for a resolution Paragraph 11 11.
Amendment 126 #
Motion for a resolution Paragraph 11 11. Invites the Commission
Amendment 127 #
Motion for a resolution Paragraph 11 11. Invites the Commission and the Member States to strengthen cooperation
Amendment 128 #
Motion for a resolution Paragraph 11 11. Invites the Commission and the Member States to strengthen cooperation with the police to protect minors against online crimes
Amendment 129 #
Motion for a resolution Paragraph 11 11. Invites the Commission and the Member States to strengthen cooperation with the police to protect minors against online crimes as well as to coordinate
Amendment 13 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed at both regulatory level, by deploying more effective instruments of prevention and repression
Amendment 130 #
Motion for a resolution Paragraph 11 11. Invites the Commission and the Member States to strengthen cooperation with the police to protect minors against online crimes as well as to coordinate hotlines, and to make agreements with Internet services suppliers to this end; Calls on the Commission and the Member States to work in cooperation with third countries to prevent illegal activity;
Amendment 131 #
Motion for a resolution Paragraph 12 12. Encourages Member States to take forward national hotlines and other contact points, such as ‘safety buttons’, that conform to the INHOPE standard
Amendment 132 #
Motion for a resolution Paragraph 12 bis (new) 12a. Underlines the importance of disseminating reliable instruments such as warning pages or acoustic and optical signals to limit direct access of minors to content that is harmful to them;
Amendment 133 #
Motion for a resolution Paragraph 13 13. Asks the Commission and the Member States to improve information regarding hotlines and other contact points, such as ‘safety buttons’ for minors and their families;
Amendment 134 #
Motion for a resolution Paragraph 13 13. Asks the Commission and the Member States to improve information regarding hotlines for minors and their families, thereby making it easier to report illegal content;
Amendment 135 #
Motion for a resolution Paragraph 13 13. Asks the Commission and the Member States to improve information regarding hotlines for minors and their families and calls for the Member States to raise awareness of the existence of Hotlines as points of contact for reporting child sexual abuse images;
Amendment 136 #
Motion for a resolution Paragraph 14 Amendment 137 #
Motion for a resolution Paragraph 14 14. Supports the commitment of digital
Amendment 138 #
Motion for a resolution Paragraph 14 14. Supports the commitment of digital content and service suppliers to implement codes of conduct to identify, prevent and
Amendment 139 #
Motion for a resolution Paragraph 14 α (new) Amendment 14 #
Motion for a resolution Recital A a (new) A a. whereas there is a need to address all forms of illegal online content, the specificity of child sexual abuse must be recognised as not only is this content illegal but it is also one of the most abhorrent forms of content available online;
Amendment 140 #
Motion for a resolution Paragraph 14 bis (new) 14a. Regrets the failure to comply with the pact signed on 9 February 2009 between the EC and 17 social networking sites, including Facebook and Myspace, which promoted the protection and security of minors online;
Amendment 141 #
Motion for a resolution Paragraph 14 a (new) 14a. Highlights that online crimes are often of a cross-border nature, therefore an important element in combating them should be international cooperation of existing law enforcement agencies;
Amendment 142 #
Motion for a resolution Paragraph 14 a (new) 14 a. Urges Member States and the Commission to support and launch awareness raising campaigns targeting children, parents and educators in order to provide the information necessary for the protection against cyber crime, as well as to encourage them to report suspicious websites and online behaviour;
Amendment 143 #
Motion for a resolution Paragraph 14 b (new) 14 b. Urges authorities to reach agreements with content providers and server hosts to repel illegal or threatening activities in mainstream online media, most typically in social networks;
Amendment 144 #
Motion for a resolution Paragraph 14 c (new) 14 c. Calls on Member States and the Commission to properly establish the national and international procedural rules for shutting down websites hosting exploitative, threatening, abusive, discriminatory or otherwise malicious content;
Amendment 146 #
Motion for a resolution Paragraph 15 Amendment 147 #
Motion for a resolution Paragraph 15 15.
Amendment 148 #
Motion for a resolution Paragraph 15 15. Encourages the
Amendment 149 #
Motion for a resolution Paragraph 15 15. Encourages the Commission and Member States to develop strategies and standards to
Amendment 15 #
Motion for a resolution Recital A a (new) A a. whereas one of the main objectives of an effective children protection strategy should be to ensure that all children, young people and parents/carers are provided with the information and skills to be able to safeguard themselves online;
Amendment 150 #
Motion for a resolution Paragraph 15 15. Encourages the Commission and Member States to develop strategies
Amendment 151 #
Motion for a resolution Paragraph 15 15. Encourages the Commission, and Member States and the internet industry to develop strategies and standards to protect minors from online and offline exposure to content that is unsuitable for their age, including violence, advertising which encourages overspending and the purchase of virtual goods or credits with their mobile phones;
Amendment 152 #
Motion for a resolution Paragraph 15 a (new) 15a. Welcomes technical innovation whereby businesses offer special online solutions to allow children to use the Internet safely;
Amendment 153 #
Motion for a resolution Paragraph 16 16. Invites associations of audiovisual and digital service suppliers, in cooperation with other relevant associations, to integrate the protection of minors into their respective by-laws and to indicate the appropriate age group;
Amendment 154 #
Motion for a resolution Paragraph 17 17. Encourages Member States to continue the dialogue to harmonise the classification of digital content for minors, in cooperation with the relevant operators and associations;
Amendment 155 #
Motion for a resolution Paragraph 17 17. Encourages Member States to continue the dialogue to harmonise the classification of digital content for minors and to also work in cooperation with third countries;
Amendment 156 #
Motion for a resolution Paragraph 17 α (new) 17a. Encourages the Committee and the Member States to classify electronic games with distinct characters based on the age to which they are addressed and, above all, on their content
Amendment 157 #
Motion for a resolution Paragraph 18 18. Invites the Commission to continue the ‘European Framework for Safer Mobile Use’ by exploiting the options that facilitate parental control
Amendment 158 #
Motion for a resolution Paragraph 18 18.
Amendment 159 #
Motion for a resolution Paragraph 18 18. Invites the Commission to continue the ‘European Framework for Safer Mobile
Amendment 16 #
Motion for a resolution Recital B Amendment 160 #
Motion for a resolution Paragraph 18 a (new) 18 a. Stresses the good work done by civil society organisations and encourages these organisations to cooperate and work together across borders as well as working in partnership with law enforcements bodies, government, internet service providers and the public;
Amendment 161 #
Motion for a resolution Paragraph 19 Amendment 162 #
Motion for a resolution Paragraph 19 19. Welcomes the Commission’s proposal on laws to protect the privacy of minors
Amendment 163 #
Motion for a resolution Paragraph 19 19. Welcomes the Commission's proposal on laws to protect the privacy of minors such as ‘the right to be forgotten’ which bans the preservation online of information on the personal data of minors, which may risk their personal and professional life; Welcomes also the intention to establish an electronic system for age certification;
Amendment 164 #
Motion for a resolution Paragraph 19 bis (new) 19a. Underlines the importance of making users aware of how their personal data and the data of associated parties are handled by service providers or social networks and of the options available to them for redress in cases where their data are used outside the scope of the legitimate purposes for which they were collected by providers and their partners, this information to appear in a language and form adapted to the user profiles, with special attention paid to minors; considers that providers have particular responsibilities in this regard, and calls on them to inform users in a clear, comprehensible manner of their publication policies;
Amendment 165 #
Motion for a resolution Paragraph 19 a (new) 19a. Welcomes the input of the Commission in relation to the ‘right to be forgotten’, but warns against the misconception that content can ever be completely deleted from the Internet;
Amendment 166 #
Motion for a resolution Paragraph 20 Amendment 167 #
Motion for a resolution Paragraph 20 Amendment 168 #
Motion for a resolution Paragraph 20 20.
Amendment 169 #
Motion for a resolution Paragraph 20 20. Encourages the promotion in every digital sector of technological options which, if selected, can limit the websurfing of minors within traceable limits and with conditional access; thereby providing an effective tool for parental control;
Amendment 17 #
Motion for a resolution Recital B B. whereas the rapid development of technologies
Amendment 170 #
Motion for a resolution Paragraph 20 20. Encourages the promotion in every digital sector of technological options which, if selected, can limit the websurfing of minors within traceable limits and with conditional access, notes however that such measures cannot replace thorough training for minors in the use of the media;
Amendment 171 #
Motion for a resolution Paragraph 20 bis (new) 20a. Underlines the importance of informing children and adolescents at a very early stage of their rights to privacy on the Internet and teaching them to recognise the sometimes subtle methods used to obtain information from them;
Amendment 172 #
Motion for a resolution Paragraph 21 21. Invites the Member States to develop
Amendment 173 #
Motion for a resolution Paragraph 22 22. Underlines that
Amendment 174 #
Motion for a resolution Paragraph 22 22. Underlines that digital
Amendment 175 #
Motion for a resolution Paragraph 22 22. Underlines that digital citizenship is an essential element in European citizenship, in order to
Amendment 176 #
Motion for a resolution Paragraph 22 22. Underlines that digital citizenship is an essential element in European citizenship, in order to create knowledgeable citizens who are protagonists in
Amendment 177 #
Motion for a resolution Paragraph 23 23. Invites the Member States to consider digital platforms as training
Amendment 178 #
Motion for a resolution Paragraph 23 23. Invites the Member States to consider digital platforms as training grounds for democratic participation for every child with special regard to the most vulnerable;
Amendment 179 #
Motion for a resolution Paragraph 24 24. Underlines the
Amendment 18 #
Motion for a resolution Recital B B. whereas the rapid development of technologies makes prompt answers necessary
Amendment 180 #
Motion for a resolution Paragraph 24 24. Underlines the importance of promoting, in services and digital content, understanding and dialogue
Amendment 181 #
Motion for a resolution Paragraph 24 24. Underlines the importance of promoting, in services and digital content, understanding and dialogue between generations, genders, and various cultural
Amendment 182 #
Motion for a resolution Paragraph 24 bis (new) 24a. Recalls that information and citizenship are closely linked on the Internet and that what threatens the civic engagement of young people today is the lack of interest they show in information;
Amendment 19 #
Motion for a resolution Recital B B. whereas the rapid development of technologies makes prompt answers necessary through
Amendment 2 #
Motion for a resolution Citation 11 bis (new) - having regard to the Commission Communication of 28 March 2012 on ‘Tackling Crime in our Digital Age: Establishing a European Cybercrime Centre’ (COM (2012) 140 final),
Amendment 20 #
Motion for a resolution Recital C C. whereas the education sector
Amendment 21 #
Motion for a resolution Recital C C. whereas the education sector is adjusting to the digital world, but at a pace and in a way that is failing to keep up
Amendment 22 #
Motion for a resolution Recital C C. whereas the digital world provides numerous opportunities related to education and learning; whereas the education sector is adjusting to the digital world, but at a pace and in a way that is failing to keep pace with the technological changes in the lives of minors, while parents and educators are having problems in helping them as they remain on the margins of their virtual lives;
Amendment 23 #
Motion for a resolution Recital C a (new) C a. Whereas it is not only important that minors better understand the potential dangers they face online, but that families, schools and civil society must all share responsibility in educating minors and ensuring that children are properly protected when using the internet and other new media;
Amendment 24 #
Motion for a resolution Recital C bis (new) Ca. whereas education in the media and in the new information and communication technologies is important in developing policies for the protection of minors in the digital world and its role in ensuring the safe, appropriate and critical use of these technologies;
Amendment 25 #
Motion for a resolution Recital C bis (new) Ca. Considering that minors generally demonstrate great ease in using the Internet but that it is necessary to help them to use it wisely, responsibly and safely;
Amendment 26 #
Motion for a resolution Recital C a (new) C a. whereas the development of digital technologies represents a great opportunity to provide children and young people with opportunities to use new media and internet effectively in ways that empower them to share their voice with others and therefore empower them to participate and experiment an active role in society, online and offline;
Amendment 27 #
Motion for a resolution Recital D D. whereas
Amendment 28 #
Motion for a resolution Recital D D. whereas exercise of
Amendment 29 #
Motion for a resolution Recital D bis (new) Da. whereas, in addition to the fight against illegal and inappropriate content, prevention and intervention measures for the protection of minors must also cover a number of other threats such as harassment, discrimination, restriction of access to services, online surveillance, attacks on privacy and freedom of expression and information, lack of clarity regarding the aims of collecting personal data, etc.;
Amendment 3 #
Motion for a resolution Citation 12 bis (new) - having regard to European Parliament Written Declaration No 29/2010, approved by an absolute majority on 23 June 2010, on the setting up of a ‘European early warning system (EWS) for paedophiles and sex offenders’,
Amendment 30 #
Motion for a resolution Recital E Amendment 31 #
Motion for a resolution Recital E E. whereas the
Amendment 32 #
Motion for a resolution Recital E E. whereas the
Amendment 33 #
Motion for a resolution Recital E E. whereas the
Amendment 34 #
Motion for a resolution Recital E E. whereas the level of diffusion of media such as computers, TV on different platforms, mobile phones, videogames, tablets, apps, that converge in a single digital system implies risks in terms of easy access to content that is illegal, unsuitable, and harmful for the development of minors;
Amendment 35 #
Motion for a resolution Recital E E. whereas the level of diffusion of media that converge in a single digital system implies not only opportunities, but also risks in terms of easy access to content that is illegal, unsuitable, and harmful for the development of minors;
Amendment 36 #
Motion for a resolution Recital E E. whereas the level of diffusion of media that converge in a single digital system implies risks in terms of easy access to content that is illegal, unsuitable, and harmful for the development of minors; and whereas these types of content are inherently different and therefore require different solutions;
Amendment 37 #
Motion for a resolution Recital F Amendment 38 #
Motion for a resolution Recital F F. whereas, in the free circulation of the audiovisual services of the single market,
Amendment 39 #
Motion for a resolution Recital F F. whereas,
Amendment 4 #
Motion for a resolution Citation 12 a (new) - having regard to the communication of the Commission of 2 may 2012 "European Strategy for a better internet for children" (COM (2012) 196),
Amendment 40 #
Motion for a resolution Recital F F. whereas, in the free circulation of the audiovisual services of the single market, where digital service providers have differing responsibilities, the protection of minors and human dignity
Amendment 41 #
Motion for a resolution Recital F F. whereas, in the free circulation of the audiovisual services of the single market, where digital service providers have differing responsibilities, the protection of minors and human dignity
Amendment 42 #
Motion for a resolution Recital F F. whereas, in the free circulation of the audiovisual services of the single market, where digital service providers have differing responsibilities, the protection of minors and human dignity is pre-eminent;
Amendment 43 #
Motion for a resolution Recital F F. whereas, in the free circulation of the audiovisual services
Amendment 44 #
Motion for a resolution Recital G Amendment 45 #
Motion for a resolution Recital G G. whereas
Amendment 46 #
Motion for a resolution Recital G G. whereas the measures to prevent illegal online content
Amendment 47 #
Motion for a resolution Recital G G. whereas the measures to prevent illegal online content
Amendment 48 #
Motion for a resolution Recital G G. whereas the measures to prevent illegal online content lead to differing approaches to the prevention of
Amendment 49 #
Motion for a resolution Recital G G. whereas the measures taken by Member States to prevent illegal online content lead to differing approaches to the prevention of unsuitable conduct;
Amendment 5 #
Motion for a resolution Citation 16 a (new) - having regard to the European Parliament Report on a Comprehensive approach on personal data protection in the European Union 2011/2025(INI),
Amendment 50 #
Motion for a resolution Recital H H. whereas the fact that personal information and data with regard to minors remain online
Amendment 51 #
Motion for a resolution Recital H H. whereas the fact that personal information and data with regard to minors remain online can imply the illegal processing thereof, as well as harm to their personal dignity,
Amendment 52 #
Motion for a resolution Recital H H. whereas the fact that personal information and data with regard to minors remain online can imply the illegal processing thereof, as well as harm to their personal dignity, thus
Amendment 53 #
Motion for a resolution Recital H H. whereas the fact that personal information and data with regard to minors remain online can imply the illegal processing thereof, even for commercial purposes, as well as harm to their personal dignity, thus compromising their
Amendment 54 #
Motion for a resolution Recital H H. whereas the fact that personal information and data with regard to minors remain online can imply the illegal processing thereof, as well as their exploitation or harm to their personal dignity, thus compromising their identity and social inclusion;
Amendment 55 #
Motion for a resolution Recital Η α (new) Ηa. whereas the fact that rapid growth of the means of social networking entails certain dangers to security of the personal life, personal data and personal dignity of minors;
Amendment 56 #
Motion for a resolution Recital H bis (new) Ha. whereas almost 15 % of Internet users who are minors aged between 10 and 17 receive some sexual solicitation, and 34 % of them encounter sexual material that they did not search for;
Amendment 57 #
Motion for a resolution Recital I I. whereas the various
Amendment 58 #
Motion for a resolution Recital I I. whereas the various regulatory methods adopted by suppliers of digital content and services do not always satisfy requirements in respect of transparency, independence, confidentiality, the processing of personal data
Amendment 59 #
Motion for a resolution Recital I a (new) I a. whereas advertising targeted at children should be responsible and moderate;
Amendment 6 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed at both regulatory level, by deploying more effective
Amendment 60 #
Motion for a resolution Recital J J. whereas minors must be protected from the dangers of the digital world in accordance with their age and developmental progress; whereas the Member States are reporting difficulties in coordinating aspects on the adoption of classification categories for content by age range and the risk level of the content;
Amendment 61 #
Motion for a resolution Recital J a (new) J a. Whereas, while acknowledging the many dangers that minors face in the digital world, we should also continue to embrace the many opportunities that the digital world brings in growing a knowledge based society;
Amendment 62 #
Motion for a resolution Recital J a (new) Ja. taking account of the fact that the role of parents in the process of protecting their children from the dangers stemming from the digital world is very significant; taking account of the fact that full involvement of both parents in the process of raising a child is decisive in the child’s development and the ability to defend against dangerous content, including digital content;
Amendment 63 #
Motion for a resolution Paragraph 1 Amendment 64 #
Motion for a resolution Paragraph 1 Amendment 65 #
Motion for a resolution Paragraph 1 Amendment 66 #
Motion for a resolution Paragraph 1 1. Asks the Commission to impro
Amendment 67 #
Motion for a resolution Paragraph 1 1.
Amendment 68 #
Motion for a resolution Paragraph 1 1. Asks the Commission to propose a single and clear framework directive on the rights of minors in the digital world, in order to integrate and codify all the provisions regarding minors envisaged in the previous provisions of the EU;
Amendment 69 #
Motion for a resolution Paragraph 1 1. Asks the Commission to propose a single framework directive on the rights of minors in the digital world, in order to integrate all the provisions regarding minors envisaged in the previous provisions of the EU, in compliance with the European provisions on fundamental rights and freedoms;
Amendment 7 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed at
Amendment 70 #
Motion for a resolution Paragraph 1 a (new) 1 a. Welcomes the Commission's announcement of a future European Cybercrime Centre and the support provided in protecting children;
Amendment 71 #
Motion for a resolution Paragraph 1 a (new) 1 a. Welcomes the newly adopted Commission strategy to protect the rights of minors in the digital world, and calls on the Commission, Member States and the industry to continue to cooperate efficiently in order to deliver a safer internet for children;
Amendment 72 #
Motion for a resolution Paragraph 2 2.
Amendment 73 #
Motion for a resolution Paragraph 2 2. In
Amendment 74 #
Motion for a resolution Paragraph 2 2. Invites all the Member States to ratify the Council of Europe Convention on the Protection of Children against Sexual
Amendment 75 #
Motion for a resolution Paragraph 2 a (new) 2a. Believes it to be extremely important that training in media skills should begin at the earliest possible stage, educating children and adolescents to decide in a critical and informed manner which paths they wish follow in the Internet and which they wish to avoid, as well as promoting fundamental values in relation to coexistence and a respectful and tolerant attitude to other people;
Amendment 76 #
Motion for a resolution Paragraph 2 bis (new) 2a. Invites the European Commission to follow up effectively on Written Declaration No 29, approved by an absolute majority of MEPs, in which Parliament requested, among other things, the creation of a European early warning system for paedophiles and sexual offenders;
Amendment 77 #
Motion for a resolution Paragraph 2 a (new) 2 a. Calls on the Commission to cooperate and if necessary develop a memorandum of understanding with the Council of Europe to avoid duplication of work;
Amendment 78 #
Motion for a resolution Paragraph 2 b (new) 2 b. Welcomes the new cyber security agency based at Europol and calls on the Commission ensure the child protection team within the new centre is adequately resourced and cooperates effectively with Interpol;
Amendment 79 #
Motion for a resolution Paragraph 3 3.
Amendment 8 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed at both regulatory level, by deploying more effective instruments of prevention and
Amendment 80 #
Motion for a resolution Paragraph 3 3.
Amendment 81 #
Motion for a resolution Paragraph 3 3. Hopes for the continuation of the Safer Internet programme, as well as other programmes, with adequate resources and the safeguarding of its specific character;
Amendment 82 #
Motion for a resolution Paragraph 3 3. Hopes for the continuation of the Safer Internet programme, with adequate resources and the safeguarding of its specific character and calls for a thorough review of its successes and failures in order to ensure maximum effectiveness moving forward;
Amendment 83 #
Motion for a resolution Paragraph 3 bis (new) 3a. Notes the creation, at the initiative of the Commission, of the CEO coalition for the online security of minors; calls for, in this regard, close collaboration with civil society associations and organisations working for the protection of minors, data protection, education, etc., representatives of parents and educators, including at a European level, as well as the various Commission Directorates-General that are tasked with consumer protection and justice;
Amendment 84 #
Motion for a resolution Subheading 2 Amendment 85 #
Motion for a resolution Paragraph 4 4. Identifies in ‘Media Education’ the essential tool for a
Amendment 86 #
Motion for a resolution Paragraph 4 4. Identifies in ‘Media Education’ the essential tool for access to and
Amendment 87 #
Motion for a resolution Paragraph 4 4. Identifies in
Amendment 88 #
Motion for a resolution Paragraph 4 4. Identifies in ‘Media Education’ the essential tool for access to and training of minors in the critical use of media and opportunities of the digital world and invites all the Member States to enhance Media Education in the school curriculum and to make use of the good practice of ‘European Schoolnet’; reminds the Commission that ‘Consumer Education’ is also important given the continued growth of digital marketing;
Amendment 89 #
Motion for a resolution Paragraph 4 a (new) 4a. Reminds the Commission that in many families children are better at using digital environments than their parents, which is why families should be informed of the dangers present in the digital world and instructed on how such dangers can be averted;
Amendment 9 #
Motion for a resolution Recital A A. whereas the protection of minors in the digital world must be addressed in a spirit of respect for the fundamental liberties and rights at both regulatory level, by deploying more effective instruments of prevention and
Amendment 90 #
Motion for a resolution Paragraph 5 5. Reiterates the importance of the digital literacy and skills of minors
Amendment 91 #
Motion for a resolution Paragraph 5 5. Reiterates the importance of the digital literacy and skills of minors being considered as a priority in the
Amendment 92 #
Motion for a resolution Paragraph 5 5. Reiterates the importance of the digital literacy and skills of minors being considered as a priority in the Union’s
Amendment 93 #
Motion for a resolution Paragraph 5 5. Reiterates the importance of the digital and media literacy and skills of minors being considered as a priority in the Union’s social policy and as a crucial component of the Europe 2020 Strategy;
Amendment 94 #
Motion for a resolution Paragraph 5 5. Reiterates the importance of the digital literacy and skills of minors and their parents being considered as a priority in the Union’s social policy;
Amendment 95 #
Motion for a resolution Paragraph 5 5. Reiterates the importance of the digital literacy and skills of minors being considered as a priority in the Union’s social and educational polic
Amendment 96 #
Motion for a resolution Paragraph 5 bis (new) Amendment 97 #
Motion for a resolution Paragraph 6 6. Underlines the need for a
Amendment 98 #
Motion for a resolution Paragraph 6 6. Underlines the need for an educational alliance among families, school, civil society, interested parties and audiovisual services, in order to guarantee a
Amendment 99 #
Motion for a resolution Paragraph 6 6. Underlines the need for an educational alliance among families, school, civil society
source: PE-489.363
|
History
(these mark the time of scraping, not the official date of the change)
docs/0 |
|
docs/4 |
|
docs/5 |
|
events/0 |
|
events/0 |
|
committees/0/shadows/3 |
|
docs/0 |
|
docs/0 |
|
docs/0/docs/0/url |
Old
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&mode=XML&language=EN&reference=PE486.198New
https://www.europarl.europa.eu/doceo/document/CULT-PR-486198_EN.html |
docs/1 |
|
docs/1 |
|
docs/1/docs/0/url |
Old
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&mode=XML&language=EN&reference=PE489.363New
https://www.europarl.europa.eu/doceo/document/CULT-AM-489363_EN.html |
docs/2 |
|
docs/2 |
|
docs/2/docs/0/url |
Old
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&mode=XML&language=EN&reference=PE491.241&secondRef=02New
https://www.europarl.europa.eu/doceo/document/LIBE-AD-491241_EN.html |
docs/3 |
|
events/0 |
|
events/0 |
|
events/1/type |
Old
Committee referral announced in Parliament, 1st reading/single readingNew
Committee referral announced in Parliament |
events/2/type |
Old
Vote in committee, 1st reading/single readingNew
Vote in committee |
events/3 |
|
events/3 |
|
events/4/docs/0/url |
Old
http://www.europarl.europa.eu/sides/getDoc.do?secondRef=TOC&language=EN&reference=20121119&type=CRENew
https://www.europarl.europa.eu/doceo/document/CRE-7-2012-11-19-TOC_EN.html |
events/6 |
|
events/6 |
|
procedure/Modified legal basis |
Rules of Procedure EP 150
|
procedure/Other legal basis |
Rules of Procedure EP 159
|
procedure/legal_basis/0 |
Rules of Procedure EP 54
|
procedure/legal_basis/0 |
Rules of Procedure EP 052
|
committees/0 |
|
committees/0 |
|
committees/3 |
|
committees/3 |
|
docs/0/docs/0/url |
Old
http://www.europarl.europa.eu/RegData/docs_autres_institutions/commission_europeenne/com/2011/0556/COM_COM(2011)0556_EN.pdfNew
http://www.europarl.europa.eu/registre/docs_autres_institutions/commission_europeenne/com/2011/0556/COM_COM(2011)0556_EN.pdf |
docs/4/body |
EC
|
events/0/docs/0/url |
Old
http://www.europarl.europa.eu/RegData/docs_autres_institutions/commission_europeenne/com/2011/0556/COM_COM(2011)0556_EN.pdfNew
http://www.europarl.europa.eu/registre/docs_autres_institutions/commission_europeenne/com/2011/0556/COM_COM(2011)0556_EN.pdf |
events/3/docs/0/url |
Old
http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&mode=XML&reference=A7-2012-353&language=ENNew
http://www.europarl.europa.eu/doceo/document/A-7-2012-0353_EN.html |
events/6/docs/0/url |
Old
http://www.europarl.europa.eu/sides/getDoc.do?type=TA&language=EN&reference=P7-TA-2012-428New
http://www.europarl.europa.eu/doceo/document/TA-7-2012-0428_EN.html |
activities |
|
commission |
|
committees/0 |
|
committees/0 |
|
committees/1 |
|
committees/1 |
|
committees/2 |
|
committees/2 |
|
committees/3 |
|
committees/3 |
|
docs |
|
events |
|
links |
|
other |
|
procedure/Modified legal basis |
Old
Rules of Procedure of the European Parliament EP 150New
Rules of Procedure EP 150 |
procedure/dossier_of_the_committee |
Old
CULT/7/09239New
|
procedure/legal_basis/0 |
Rules of Procedure EP 052
|
procedure/legal_basis/0 |
Rules of Procedure of the European Parliament EP 052
|
procedure/subject |
Old
New
|
activities/0/docs/0/celexid |
CELEX:52011DC0556:EN
|
activities/0/docs/0/celexid |
CELEX:52011DC0556:EN
|
activities/0/date |
Old
2012-11-19T00:00:00New
2011-09-13T00:00:00 |
activities/0/docs |
|
activities/0/type |
Old
Prev DG PRESNew
Non-legislative basic document published |
activities/1 |
|
activities/1/committees |
|
activities/1/date |
Old
2012-11-20T00:00:00New
2012-04-20T00:00:00 |
activities/1/docs |
|
activities/1/type |
Old
Text adopted by Parliament, single readingNew
Committee referral announced in Parliament, 1st reading/single reading |
activities/2 |
|
activities/2/committees |
|
activities/2/date |
Old
2012-05-10T00:00:00New
2012-10-09T00:00:00 |
activities/2/docs |
|
activities/2/type |
Old
Amendments tabled in committeeNew
Vote in committee, 1st reading/single reading |
activities/3 |
|
activities/3/committees |
|
activities/3/date |
Old
2012-04-20T00:00:00New
2012-10-24T00:00:00 |
activities/3/docs |
|
activities/3/type |
Old
Committee referral announced in Parliament, 1st reading/single readingNew
Committee report tabled for plenary, single reading |
activities/4/date |
Old
2011-09-13T00:00:00New
2012-11-19T00:00:00 |
activities/4/docs |
|
activities/4/type |
Old
DateNew
Debate in Parliament |
activities/5/committees |
|
activities/5/date |
Old
2012-10-09T00:00:00New
2012-11-20T00:00:00 |
activities/5/docs |
|
activities/5/type |
Old
Vote in committee, 1st reading/single readingNew
Results of vote in Parliament |
activities/7 |
|
activities/8 |
|
committees/0/rapporteur/0/mepref |
Old
4de183ff0fb8127435bdbcedNew
4f1ac747b819f25efd000079 |
committees/0/shadows/0/group |
Old
EPPNew
PPE |
committees/0/shadows/0/mepref |
Old
4de189830fb8127435bdc4b9New
4f1adcc2b819f207b3000137 |
committees/0/shadows/1/mepref |
Old
4de1889d0fb8127435bdc384New
4f1adb1bb819f207b30000a9 |
committees/0/shadows/2/mepref |
Old
4de183780fb8127435bdbc21New
4f1ac649b819f25efd00002f |
committees/0/shadows/3/mepref |
Old
4de1893e0fb8127435bdc45bNew
4f1adc57b819f207b3000112 |
committees/3/rapporteur/0/mepref |
Old
4de185620fb8127435bdbee9New
4f1ac900b819f25efd000105 |
procedure/Modified legal basis |
Rules of Procedure of the European Parliament EP 150
|
procedure/legal_basis/0 |
Old
Rules of Procedure of the European Parliament EP 048New
Rules of Procedure of the European Parliament EP 052 |
activities/8/type |
Old
Debate scheduledNew
Debate in Parliament |
activities/10/docs |
|
activities/10/type |
Old
Vote scheduledNew
Text adopted by Parliament, single reading |
procedure/stage_reached |
Old
Awaiting Parliament 1st reading / single reading / budget 1st stageNew
Procedure completed |
activities/7/docs/0/text |
|
activities/1/docs/0/url |
Old
http://eur-lex.europa.eu/smartapi/cgi/sga_doc?smartapi!celexplus!prod!DocNumber&lg=EN&type_doc=COMfinal&an_doc=2011&nu_doc=556New
http://www.europarl.europa.eu/registre/docs_autres_institutions/commission_europeenne/com/2011/0556/COM_COM(2011)0556_EN.pdf |
activities/7/docs/0/url |
http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&mode=XML&reference=A7-2012-353&language=EN
|
activities/5/date |
Old
2012-11-22T00:00:00New
2012-05-10T00:00:00 |
activities/5/docs |
|
activities/5/type |
Old
EP 1R PlenaryNew
Amendments tabled in committee |
activities/9/body |
Old
EPNew
EC |
activities/9/commission |
|
activities/9/date |
Old
2012-05-10T00:00:00New
2012-11-19T00:00:00 |
activities/9/docs |
|
activities/9/type |
Old
Amendments tabled in committeeNew
Prev DG PRES |
activities/11 |
|
activities/8 |
|
activities/9 |
|
activities/10/type |
Old
Indicative plenary sitting date, 1st reading/single readingNew
EP 1R Plenary |
activities/11 |
|
activities/7 |
|
activities/2 |
|
activities/2/date |
Old
2012-11-19T00:00:00New
2012-04-02T00:00:00 |
activities/2/docs |
|
activities/2/type |
Old
EP 1R PlenaryNew
Committee draft report |
activities/8 |
|
activities/7/type |
Old
Indicative plenary sitting date, 1st reading/single readingNew
EP 1R Plenary |
activities/8 |
|
activities/9 |
|
activities/1/docs/0/text/0 |
Old
The Commission presents report on the application of Council Recommendation 98/560/EC concerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows: Tackling illegal or harmful content: content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments / codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Hotlines: the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on. Internet Service Providers (ISPs): ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC).This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world. Social networking sites: given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Problematic Internet content from other Member States / from outside the EU: enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries. Access restrictions to content: this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation - the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and reflecting the established approaches to the liability of the various Internet actors. Audiovisual Media Services: as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services. Conclusions: the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse - and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes. Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers. New
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors. CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe. The report discusses Member States reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas: Tackling illegal and harmful content: while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Making social networks safer places: whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users. Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation. Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Streamlining content rating schemes: 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field. Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature e of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of appropriateness and reflecting the established approaches to the liability of the various Internet actors. Danger of market fragmentation: the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the do's and don't to protect and empower children who go online. This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world. |
activities/1/type |
Old
Follow-up documentNew
Non-legislative basic document |
activities/6/committees |
|
activities/6/type |
Old
Vote scheduled in committee, 1st reading/single readingNew
Vote in committee, 1st reading/single reading |
activities/1/docs/0/text/0 |
Old
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors. CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe. The report discusses Member States reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas: Tackling illegal and harmful content: while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Making social networks safer places: whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users. Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation. Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Streamlining content rating schemes: 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field. Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature e of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of appropriateness and reflecting the established approaches to the liability of the various Internet actors. Danger of market fragmentation: the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the do's and don't to protect and empower children who go online. This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world. New
The Commission presents report on the application of Council Recommendation 98/560/EC concerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows: Tackling illegal or harmful content: content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments / codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Hotlines: the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on. Internet Service Providers (ISPs): ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC).This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world. Social networking sites: given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Problematic Internet content from other Member States / from outside the EU: enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries. Access restrictions to content: this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation - the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and reflecting the established approaches to the liability of the various Internet actors. Audiovisual Media Services: as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services. Conclusions: the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse - and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes. Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers. |
activities/1/type |
Old
Non-legislative basic documentNew
Follow-up document |
activities/1 |
|
activities/1/date |
Old
2012-10-25T00:00:00New
2011-09-13T00:00:00 |
activities/1/docs |
|
activities/1/type |
Old
Prev DG PRESNew
Non-legislative basic document |
activities/2 |
|
activities/2/date |
Old
2012-10-25T00:00:00New
2012-04-02T00:00:00 |
activities/2/docs |
|
activities/2/type |
Old
EP 1R PlenaryNew
Committee draft report |
activities/1/commission/0/DG/title |
Old
Information Society and MediaNew
Communications Networks, Content and Technology |
activities/1/commission/0/DG/url |
Old
http://ec.europa.eu/dgs/information_society/index_en.htmNew
http://ec.europa.eu/dgs/connect/index_en.htm |
activities/7 |
|
activities/8 |
|
activities/9/date |
Old
2012-10-25T00:00:00New
2012-11-19T00:00:00 |
other/0/dg/title |
Old
Information Society and MediaNew
Communications Networks, Content and Technology |
other/0/dg/url |
Old
http://ec.europa.eu/dgs/information_society/index_en.htmNew
http://ec.europa.eu/dgs/connect/index_en.htm |
activities/6/date |
Old
2012-09-19T00:00:00New
2012-10-09T00:00:00 |
activities/5/docs |
|
activities/5/type |
Old
Deadline AmendmentsNew
Amendments tabled in committee |
activities/7 |
|
activities/8 |
|
activities/1/docs/0/text/0 |
Old
The Commission presents report on the application of Council Recommendation 98/560/ECconcerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows: Tackling illegal or harmful content: content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments/codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Hotlines: the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on. Internet Service Providers (ISPs): ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC).This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world. Social networking sites: given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Problematic Internet content from other Member States / from outside the EU: enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries. Access restrictions to content: this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation - the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and reflecting the established approaches to the liability of the various Internet actors. Audiovisual Media Services: as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services. Conclusions: the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse - and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes. Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers. New
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors. CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe. The report discusses Member States reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas: Tackling illegal and harmful content: while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Making social networks safer places: whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users. Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation. Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Streamlining content rating schemes: 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field. Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature e of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of appropriateness and reflecting the established approaches to the liability of the various Internet actors. Danger of market fragmentation: the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the do's and don't to protect and empower children who go online. This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world. |
activities/1/type |
Old
Follow-up documentNew
Non-legislative basic document |
activities/7/type |
Old
Indicative plenary sitting date, 1st reading/single readingNew
EP 1R Plenary |
activities/8 |
|
activities/9 |
|
activities/1/docs/0/text/0 |
Old
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors. CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe. The report discusses Member States reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas: Tackling illegal and harmful content: while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Making social networks safer places: whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users. Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation. Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Streamlining content rating schemes: 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field. Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature e of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of appropriateness and reflecting the established approaches to the liability of the various Internet actors. Danger of market fragmentation: the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the do's and don't to protect and empower children who go online. This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world. New
The Commission presents report on the application of Council Recommendation 98/560/ECconcerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows: Tackling illegal or harmful content: content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments/codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Hotlines: the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on. Internet Service Providers (ISPs): ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC).This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world. Social networking sites: given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Problematic Internet content from other Member States / from outside the EU: enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries. Access restrictions to content: this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation - the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and reflecting the established approaches to the liability of the various Internet actors. Audiovisual Media Services: as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services. Conclusions: the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse - and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes. Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers. |
activities/1/type |
Old
Non-legislative basic documentNew
Follow-up document |
procedure/legal_basis |
|
activities/1/docs/0/text/0 |
Old
The Commission presents report on the application of Council Recommendation 98/560/ECconcerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows: Tackling illegal or harmful content: content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments/codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Hotlines: the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on. Internet Service Providers (ISPs): ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC).This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world. Social networking sites: given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Problematic Internet content from other Member States / from outside the EU: enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries. Access restrictions to content: this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation - the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and reflecting the established approaches to the liability of the various Internet actors. Audiovisual Media Services: as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services. Conclusions: the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse - and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes. Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers. New
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors. CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe. The report discusses Member States reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas: Tackling illegal and harmful content: while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Making social networks safer places: whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users. Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation. Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Streamlining content rating schemes: 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field. Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature e of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of appropriateness and reflecting the established approaches to the liability of the various Internet actors. Danger of market fragmentation: the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the do's and don't to protect and empower children who go online. This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world. |
activities/1/type |
Old
Follow-up documentNew
Non-legislative basic document |
activities/1/docs/0/text/0 |
Old
PURPOSE: to analyse the implementation and effectiveness of the measures in Council Recommendations 98/560/EC and 2006/952/EC on Protection of Minors. CONTENT: the objective of the 1998 and the 2006 Recommendations on Protection of Minors was to make Member States and industry conscious of the new challenges for the protection of minors in electronic media, particularly those linked to the uptake and growing importance of online services. This report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. It also ask the question whether current policies are still suitable and adequate to ensure a high level of protection for minors throughout Europe. The report discusses Member States reports and states that as a positive general result, the survey of Member States on the various dimensions of the 1998 and 2006 Recommendations shows that all Member States are conscious of the challenges for the protection of minors online and are increasingly making efforts to respond to them. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible and responsive a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse and in a number of cases, even diverging - actions across Europe. This is in particular true in certain areas: Tackling illegal and harmful content: while there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Going forward, existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. For instance, reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Making social networks safer places: whilst social networking sites offer huge opportunities for minors, they also bear a considerable risk potential, which can be summarised by the categories "illegal content", "age-inappropriate content", "inappropriate contact" and "inappropriate conduct". One promising way to counter these risks is guidelines, addressing providers of social networking sites and/or users. Only 10 Member States referred to such guidelines, and even fewer reported that there are evaluation systems in place to assess their effectiveness. Therefore, "soft law" rules currently suffer from rather patchy implementation. Given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient and consistent manner. Active stakeholder engagement is encouraged, in particular through further awareness- raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Streamlining content rating schemes: 16 Member States and Norway responded that they have diverging age ratings and classifications for different types of media. Ten Member States and Norway consider this to be a problem. Eight Member States and Norway point out that there are measures or initiatives being considered to introduce greater consistency in this field. Altogether 15 Member States and Norway consider cross-media and/or pan-European classification systems for media content helpful and feasible. This is contradicted by nine Member States which point to the cultural differences. This is an area of most extreme fragmentation the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature e of online content, ways to better align such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of appropriateness and reflecting the established approaches to the liability of the various Internet actors. Danger of market fragmentation: the report goes on to point out that quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers who try to identify the do's and don't to protect and empower children who go online. This report and the detailed responses gathered in the survey of Member States demonstrate that further action at European level may build on the best practices of the Member States and reach economies of scale for the ICT sector that will help children to safely reap the benefits of the constantly evolving digital world. New
The Commission presents report on the application of Council Recommendation 98/560/ECconcerning the protection of minors and human dignity and of Recommendation 2006/952/EC of the European Parliament and of the Council on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry. In accordance with the requirements of the 2006 Recommendation, the report analyses the implementation and effectiveness of the measures specified in the 1998 and 2006 Recommendations in Member States. The findings may be summarised as follows: Tackling illegal or harmful content: content and service providers are increasingly making efforts to tackle discriminating and other illegal or harmful content, particularly through self-commitments/codes of conduct, which exist in 24 Member States. As far as Internet content is concerned, some of these initiatives ensure that websites may signal their compliance with a code of conduct by displaying an appropriate label. Efforts are also made to develop access to appropriate content for minors, for instance through specific websites for children and specific search engines While there is convergence in the Member States that promoting self-regulatory measures (codes of conduct) is useful, there is persistent concern that the protection levels achieved in this field still differ significantly. Existing measures against illegal or harmful contents should be constantly monitored in order to ensure their effectiveness. Reporting points for this type of content, provided by the content provider and to be used by children and parents are being developed and supported by functioning back office infrastructures, but all these initiatives lack common features and economies of scale that would increase their efficiency. Hotlines: the widespread establishment and networking of hotlines is encouraging, but not sufficient. In order to foster both their efficiency and more consistency amongst Member States (e.g. best practices of interactions with law enforcement authorities), ways to make them more easily accessible and to improve their functioning and develop synergies with other related services (e.g. Helplines and Awareness Centres, 116 000/116 111 numbers) should be reflected on. Internet Service Providers (ISPs): ISPs are increasingly involved in the protection of minors, despite their limited liability and responsibility under the E-Commerce-Directive (Directive 2000/31/EC).This applies to their legal obligations regarding illegal content, but particularly to joint voluntary commitments and adherence to codes of conduct. However, ISP associations generally have no specific mandate regarding the protection of minors. Therefore, signature and compliance with codes of conduct for the protection of minors is generally only optional for members of such associations. ISPs are encouraged to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored. ISP associations are encouraged to include protection of minors in their mandates and commit their members accordingly. Moreover, greater involvement of consumers and authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world. Social networking sites: given the massive expansion of social networking sites, operators' control systems fall short of covering all the potential risks in an efficient manner. Active stakeholder engagement is encouraged, in particular through further awareness-raising as regards the risks and ways to mitigate them, wider use of guidelines, with implementation monitoring. In addition, reporting points with a well functioning back office infrastructure are increasingly being deployed on social networks in order to assist children in dealing with grooming, cyber-bullying and similar issues, but the solutions are being developed on a case-by case basis. Moreover, the use of "privacy by default" settings for children joining in social networking sites is not widespread. Problematic Internet content from other Member States / from outside the EU: enhanced cooperation and harmonised protection concerning problematic Internet content seem desirable. Although this content originates mostly outside the EU, some Member States consider such an approach to be more realistic at European level than by involving third countries. Access restrictions to content: this requires on the one hand, age rating and classifying content and on the other hand ensuring respect for these ratings and classifications. The latter task falls primarily within parents' responsibility, but technical systems - filtering, age verification systems, parental control systems - provide valued support. . This is an area of most extreme fragmentation - the conceptions of what is necessary and useful diverge significantly between and within Member States. While most Member States see scope for improving their age rating and classification systems, there is clearly no consensus on the helpfulness and feasibility of cross-media and/or pan-European classification systems for media content. Still, in view of the increasingly borderless nature of online content, ways to align better such systems should be explored further. Internet-enabled devices with parental control tools are increasingly available but the articulation with the use of appropriate content relies upon case-by case solutions that vary greatly between and within Member States. Against this background, it seems worth reflecting upon innovative rating and content classifications systems that could be used more widely across the ICT sector (manufacturers, host and content providers, etc.), while leaving the necessary flexibility for local interpretations of "appropriateness" and reflecting the established approaches to the liability of the various Internet actors. Audiovisual Media Services: as regards co/self-regulation systems for the protection of minors from harmful content, on-demand audiovisual media services) are lagging behind television programmes where such systems are in place in 14 Member States, with 11 of them having a code of conduct in place. The variety of actions carried out in this field reflects the distinctions made in the Audiovisual Media Services Directive (Directive 2010/13/EU) but also the difficulty to reach consensual policy responses. Universally available technical means for offering children a selective access to content on the Internet, such as parental control tools linked to age-rated and labelled content are very diverse; the solutions developed for linear/TV broadcasting (e.g. transmission times) often seem ill-adapted to Internet and other on-demand audiovisual media services. Conclusions: the survey shows that all Member States are increasingly making efforts to respond to the challenges. A policy mix, with a significant component of self-regulatory measures, seems best suited to address in as flexible a way as possible the convergence between platforms (TV, PC, smartphones, consoles, etc.) and audiovisual content. However, the detailed assessment of the policy responses that Member States have developed presents a landscape made of very diverse - and in a number of cases, even diverging - actions across Europe. This is particularly true of tackling illegal and harmful content, making social networks safer places and streamlining content rating schemes. Quite often, the regulatory or self-regulatory measures also lack ambition and consistency with similar measures put in place in other Member States, or they are simply not effectively implemented in practice. A patchwork of measures across Europe can only lead to internal market fragmentation and to confusion for parents and teachers. |
activities/1/type |
Old
Non-legislative basic documentNew
Follow-up document |
activities/1 |
|
activities/1/date |
Old
2012-09-10T00:00:00New
2011-09-13T00:00:00 |
activities/1/docs |
|
activities/1/type |
Old
Prev DG PRESNew
Non-legislative basic document |
activities/2 |
|
activities/2/date |
Old
2012-07-09T00:00:00New
2012-04-02T00:00:00 |
activities/2/docs |
|
activities/2/type |
Old
Prev Adopt in CteNew
Committee draft report |
activities/7 |
|
activities/7 |
|
activities/8 |
|
activities/9 |
|
activities/10/date |
Old
2012-09-10T00:00:00New
2012-10-23T00:00:00 |
activities/4/committees/3/date |
2012-06-11T00:00:00
|
activities/4/committees/3/rapporteur |
|
committees/3/date |
2012-06-11T00:00:00
|
committees/3/rapporteur |
|
activities/1/docs/0/text |
|
activities/6/date |
Old
2012-06-19T00:00:00New
2012-07-09T00:00:00 |
activities/6/date |
Old
2012-07-10T00:00:00New
2012-06-19T00:00:00 |
activities/6/type |
Old
Vote scheduled in committee, 1st reading/single readingNew
Prev Adopt in Cte |
activities/1 |
|
activities/1/body |
Old
EPNew
EC |
activities/1/commission |
|
activities/1/date |
Old
2012-06-19T00:00:00New
2011-09-13T00:00:00 |
activities/1/docs |
|
activities/1/type |
Old
EP 1R CommitteeNew
Non-legislative basic document |
activities/6/type |
Old
Vote scheduled in committee, 1st reading/single readingNew
EP 1R Committee |
activities/7 |
|
procedure/legal_basis |
|
activities |
|
committees |
|
links |
|
other |
|
procedure |
|