Ministers’ Deputies
CM Documents

CM(2008)37 add 26 February 20081
——————————————

1022 Meeting, 26 March 2008
5 Media


5.1 Steering Committee on the Media and New Communication Services (CDMC)

a. Draft Recommendation CM/Rec(2008)… of the Committee of Ministers to member states on measures to promote the respect for freedom of expression and information with regard to Internet filters – Report by the Group of Specialists on human rights in the information society (MC-S-IS) on the use and impact of technical filtering measures for various types of content in the online environment2

For consideration by the GR-H at its meeting of 18 March 2008

——————————————

I. Introduction

Background and context

1. In 2005, the participating ministers in the 7th European Ministerial Conference on Mass Media Policy adopted inter alia a Resolution on human rights in the information society in which they reaffirmed “their commitment to remove, when technically feasible, any hindrances to the free flow of information through new communication services” and at the same time agreed to “Undertake, mindful of the importance of protecting minors, to step up their efforts and co-operation to minimise the risks for them of the dissemination of harmful content on the new communication services”.3

2. At the 985th meeting of the Ministers’ Deputies on 31 January 2007, the Committee of Ministers took note of the terms of reference of the Group of Specialists on Human Rights in the Information Society (MC-S-IS) according to which the Group should “Report on the use and impact of technical filtering measures for various types of content in the online environment, with particular regard to Article 10 of the European Convention on Human Rights and, if appropriate, make concrete proposals (e.g. in the form of a draft standard-setting instrument) for further Action in this area”.

3. This report has been prepared in order to examine and better understand the use and impact of technical filtering measures for various types of content in the online environment. Particular regard is paid to the impact such filtering measures may have on the right to freedom of expression and to receive and impart information and ideas in accordance with Article 10 of the European Convention on Human Rights.

4. This report provides a basis on which the MC-S-IS could, if it deems it appropriate, propose and develop concrete actions, for example in developing best practice guidelines that promote freedom of expression and information with regard to technical filtering measures.

5. It should be noted from the outset that the report’s aim is not to provide definitions for what kind of content should be classified as illegal and/or harmful under the national laws of Council of Europe member states. Its purpose is rather to develop a basis for the formulation of principles and guidelines to ensure that filtering measures are designed and applied in a way that respects freedom of expression and information.

Understanding technical filtering measures

6. At its most basic, a technical filtering measure can be defined as a technical action that limits the accessibility of Internet content. As discussed more fully below, technical filtering measures are computer (software) applications that work by blocking or filtering (categories) of illegal, harmful or otherwise unwanted content that a web browser or other Internet application is capable of displaying or downloading.

7. Taking this definition as a starting point, one can distinguish between two categories of technical filtering measures: filtering measures resulting from state interventions and filtering products offered by private actors.

Filtering measures resulting from state interventions

The first category covers measures that are established as a result of interventions by state authorities, such as when access to a specific domain or a server containing illegal material is blocked by an Internet Service Provider (ISP) following a court decision. State interventions leading to restrictions on availability of Internet content often follow from the prior characterization of the blocked or filtered content as illegal (such as in the case of child pornography)4 or unlawfully distributed (such as in the case of distribution of copyrighted content).5

Filtering products offered by private actors

The second category covers various filtering products which may be installed on the computer of an individual user or, for example, on a corporate network. Such products produced by private actors are used by both private and public actors (parents, schools, universities, libraries, enterprises etc.) to pre-select or block access to certain categories of content. In such cases the filtering or blocking of content is not the result of administrative proceedings or a court decision on the illegality of the content but of a decision taken by a private or public actor based on the need to protect against or restrict the access to certain content. In contrast to the state interventions mentioned above, filtering products are used not only to block or filter illegal content but also to restrict the user’s access to legal, albeit potentially harmful, content.

8. Any intervention by the state to block or filter access to a specific website could constitute a restriction on freedom of expression and access to information in the online environment that would have to fulfil the conditions of Article 10, paragraph 2, of the European Convention on Human Rights.6 Therefore, the technical filtering measures mentioned above must be prescribed by law, pursue one of the legitimate aims mentioned in Article 10, paragraph 2, and be necessary in a democratic society.

9. Using filtering products offered by private actors may restrict both the ability of authors of content and of users to impart and receive information and ideas via websites, blogs, newsgroups etc. Such restrictions, to the extent that they are the result of action by non-state actors, are not directly subject to the requirements of Article 10, paragraph 2, of the European Convention on Human Rights. However, it should be noted that, according to the case law of the European Court of Human Rights, the state, as the “ultimate guarantor of pluralism”, does have a positive obligation to guarantee the enjoyment of the rights contained in Article 10 of the European Convention on Human Rights.7

10. Given the differences between the two categories of measures and products and that the human rights framework for technical filtering measures in the form of state interventions against illegal content is more straightforward, the main focus of the present report will be the use and impact of such filtering products offered by private actors.

Search engines

11. Apart from filtering products, it could with some justification be claimed that search engines should be included within the group of technical filtering measures offered by private actors. Most Internet users today rely on search engines to rapidly find the most relevant information for their needs. In this way, search engines play an important role as gatekeepers in determining and shaping which information is received through the Internet. This issue is accentuated when, as is the case today, there is one dominant search engine provider on the market. The way such a search engine lists and prioritizes search results may in practice have an important impact on what information a user receives.

12. On the other hand, a website which is not listed at all, or only listed on one of the last pages of the search results produced by a search engine, would still be accessible to the user through other search engines or directly without using a search engine. The filtering effects of search engines can therefore be said to be more indirect than those of filtering products.

13. Moreover, at least on the face of it, the aims of search engines are different from filtering products: whereas filtering products aim to protect users against illegal, harmful or otherwise unwanted content, search engines aim to provide the user with rapid access to information that is most likely to fulfil the user’s information needs as expressed in the search terms employed.

14. Finally, the way search engines function differs from filtering products in a number of ways and is quite a complex phenomenon which could warrant a separate inquiry.

15. Given their peculiarities, search engines have not been further explored in this report. However, the findings in the report apply to search engine providers to the extent that they are users of filtering software.8 At the same time, it should be noted that it is likely that at least some of the best practice guidelines identified in this report could form the basis of similar guidelines for search engines.

II. Use of technical filtering measures

16. The following part of the report contains a description of the use and practical functioning of technical filtering measures focusing on the type of content filtered, the filtering method used, the Internet applications filtered and the places and institutions where such filters are currently in use.

Types of content filtered

17. Filtering products limit access to Internet content deemed undesirable in a given situation. The basic aim of a filtering product is to block a specific type of content, either to protect a user, for instance a child, against inadvertently being exposed to harmful content or to hinder a user from deliberately gaining access to illegal, harmful or otherwise objectionable content. The type of illegal or harmful content that may be subject to filtering is not identical in all Council of Europe member states and varies according to their national laws and social and cultural traditions.

18. The types of illegal, harmful or objectionable content that filtering products aim to block vary. They usually include sexually explicit content, graphically violent content, content advocating hate and content advocating illegal or harmful activity (such as drug-use, bomb-making, underage drinking and gambling, violence, software piracy, self-harm etc.).9

19. Filtering products can also be used to block access to unlawfully disseminated copyright-protected content. An example of this is the video filtering technology developed by Audible Magic and used by social networking site MySpace to block unauthorized copyright content from being posted on its website.

20. In addition, filtering products may be used to restrict access to leisure-related content that, for example, is unlikely to be related to a student’s studies or an employee’s job function.

Filtering method employed

21. Filtering products often combine several filtering and blocking methods which can be divided into six main categories:

    - URL-based filtering,
    - IP address-based filtering,
    - Protocol-based filtering,
    - Key-word blocking,
    - Filtering on the basis of labelling or rating by the content author or a third party,
    - Intelligent analysis of content (sentences, pictures).

22. With URL-based filtering the website requested by the user is matched with a pre-defined list of URLs of websites in the filtering product. One possibility is that a filter contains a “black” or “exclusion” list which contains the URLs of websites that have objectionable content. If a web page matches one of the sites on this list, the user is denied access to view the site. The opposite applies to “white” or “inclusion” lists which contain the URLs of the websites that the user is allowed to visit. The list of URLs is pre-defined by the company producing the filtering product or by a third party. While the list can in some cases be supplemented and edited by the user, the transparency of the filtering process using lists of URLs can be incomplete, in particular if the task of updating these lists is a commercial activity. URL-based filtering requires the constant updating of the list of URLs.
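By way of illustration only, the list-matching mechanism can be sketched in a few lines of Python; the list entries and function names below are invented for the example and do not correspond to any actual product:

```python
from urllib.parse import urlparse

# Hypothetical, pre-defined lists; real products ship far larger lists
# that are maintained and updated by the vendor or a third party.
BLACK_LIST = {"objectionable.example", "harmful.example"}
WHITE_LIST = {"encyclopedia.example", "news.example"}

def allowed_by_black_list(url: str) -> bool:
    """Deny access only to sites whose host appears on the 'exclusion' list."""
    host = urlparse(url).hostname or ""
    return host not in BLACK_LIST

def allowed_by_white_list(url: str) -> bool:
    """Deny access to everything except sites on the 'inclusion' list."""
    host = urlparse(url).hostname or ""
    return host in WHITE_LIST

print(allowed_by_black_list("http://objectionable.example/page"))  # False: blocked
print(allowed_by_white_list("http://unknown.example/"))            # False: not on the white list
```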

23. IP-based filtering works in a similar way, and faces similar issues, to URL-based filtering, with a given IP address used instead of a URL. It is mainly used to block access to black lists of IP addresses. IP-based filtering may result in unintended “over-blocking” due to IP sharing, since a given unique IP address may correspond to the URLs of many different websites hosted on the same server.
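The over-blocking caused by IP sharing can be illustrated with the following sketch; the host names and blocked address are invented, and a real deployment would act at the network level rather than in application code:

```python
import socket

# Hypothetical black list containing the address of a shared hosting server.
BLOCKED_IPS = {"192.0.2.10"}

def allowed(host: str) -> bool:
    """Block a request if the host resolves to a black-listed IP address."""
    try:
        ip = socket.gethostbyname(host)
    except socket.gaierror:
        return True  # hosts that cannot be resolved are simply let through in this sketch
    return ip not in BLOCKED_IPS

# If "illegal-content.example" and "harmless-blog.example" are both hosted on the
# server at 192.0.2.10, the harmless site is blocked together with the illegal one:
# this is the unintended "over-blocking" described above.
```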

24. Protocol-based filtering products prevent the use of computer applications and software based on technical protocols, independently of the content that may be accessed or carried. They may be used, for example, to prevent the use of peer-to-peer or instant messaging applications.
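A protocol-based filter looks only at connection metadata (the protocol and, typically, the destination port), never at the content carried. A simplified sketch, with invented rules:

```python
# Hypothetical rules: block connections on the basis of protocol or port alone.
BLOCKED_PROTOCOLS = {"bittorrent", "irc"}  # e.g. peer-to-peer and instant-messaging protocols
BLOCKED_PORTS = {6881, 6882, 6883, 6667}   # ports commonly associated with them (assumption)

def connection_allowed(protocol: str, destination_port: int) -> bool:
    """Decide on the basis of protocol and destination port only, regardless of content."""
    if protocol.lower() in BLOCKED_PROTOCOLS:
        return False
    if destination_port in BLOCKED_PORTS:
        return False
    return True

print(connection_allowed("https", 443))        # True: ordinary web traffic passes
print(connection_allowed("bittorrent", 6881))  # False: peer-to-peer traffic is blocked
```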

25. Keyword blocking is a method whereby the text of websites and other Internet content is compared against a list of undesirable words. If an undesirable word is identified, the filter will either remove the word or block the offending page altogether. The problem with keyword blocking is that it often leads to “overprotection”: critics of filtering products frequently point to this problem, and surveys cite many examples of harmless sites being blocked by filtering products because they contain a suspicious keyword.10
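The tendency of keyword blocking towards over-blocking is easy to see in a short sketch; the word list is invented for the example:

```python
# Hypothetical list of "undesirable" words.
BANNED_WORDS = {"sex", "drugs", "violence"}

def page_allowed(page_text: str) -> bool:
    """Block the page as soon as any banned word appears, regardless of context."""
    words = (w.strip(".,;:!?") for w in page_text.lower().split())
    return not any(w in BANNED_WORDS for w in words)

# A harmless page is blocked simply because it contains a suspicious keyword:
print(page_allowed("Public health advice on safe sex and contraception."))  # False
```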

26. Another approach used by filtering products is to block individual websites on the basis of a rating of content in categories (as is done offline for videos, for example). Typically this starts with the use of automated web crawlers to search for suspicious pages; each page is then reviewed by an employee and rated accordingly. A related approach is filtering based on labelling, whereby the content of a webpage is given a “quality-label” (often by the author), allowing webpages without that label to be blocked. One example is the international self-labelling system supported by the European Union via the Safer Internet Plus programme and managed by ICRA (part of the Family Online Safety Institute,11 formerly known as the Internet Content Rating Association). The system builds on existing work on content-labelling pioneered by the Recreational Software Advisory Council (RSACi) and on the results of earlier preparatory consultations undertaken with EU support through the INCORE study.12 Criticisms of these methods relate, on the one hand, to the subjectivity of labelling and rating processes, which reflect a given culture and specific moral values, and, on the other hand, to self-rating, which can result in non-rated websites being blocked by the filtering product used.
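A label-based filter does not examine the content itself but a machine-readable label attached to it. The sketch below assumes a simplified, invented label format; it is not the actual ICRA vocabulary, but it shows why unlabelled pages end up being blocked:

```python
from html.parser import HTMLParser

class LabelReader(HTMLParser):
    """Extract a hypothetical 'content-label' meta tag from an HTML page."""
    def __init__(self):
        super().__init__()
        self.label = None

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "meta" and attributes.get("name") == "content-label":
            self.label = attributes.get("content")

def page_allowed(html: str, accepted=frozenset({"suitable-for-children"})) -> bool:
    reader = LabelReader()
    reader.feed(html)
    # Pages without an accepted label are blocked, which is precisely the
    # criticism levelled at self-rating schemes.
    return reader.label in accepted

print(page_allowed('<meta name="content-label" content="suitable-for-children">'))  # True
print(page_allowed("<p>Unlabelled but perfectly harmless page</p>"))                # False
```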

27. Other filtering products may be less mechanical and incorporate an intelligent analysis of content, be it in the form of text or images. For text-based content, this method is more sophisticated than keyword blocking in that the mere existence of a keyword is insufficient to block the website concerned. The filtering product will instead attempt a semantic analysis of the keyword concerned, taking into consideration the context in which the suspicious keyword is used before blocking the website. For image-based content, the filtering product will analyse the images contained on a web page and will, for example, block access to the site if it contains a large percentage of flesh-coloured images. It seems that only very few filtering products currently on the market employ intelligent analysis.13
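As a very rough illustration of image analysis, the sketch below estimates the share of “flesh-coloured” pixels in an image. It assumes the third-party Pillow library and a crude colour heuristic; commercial products use considerably more sophisticated techniques:

```python
from PIL import Image  # third-party Pillow library (assumed to be installed)

def skin_ratio(path: str) -> float:
    """Return the fraction of pixels falling within a crude 'skin tone' RGB range."""
    image = Image.open(path).convert("RGB")
    pixels = list(image.getdata())

    def looks_like_skin(rgb):
        r, g, b = rgb
        return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

    return sum(1 for p in pixels if looks_like_skin(p)) / len(pixels)

def image_allowed(path: str, threshold: float = 0.4) -> bool:
    # Block the image if a large percentage of its pixels are flesh-coloured.
    return skin_ratio(path) < threshold
```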

Application-environment and level of implementation

28. Most filtering products support various applications and programmes used to receive content on the Internet. A filtering product will almost certainly be able to filter Internet content on websites accessed through a web-browser such as Internet Explorer or Firefox. Moreover, more advanced filtering products will also provide support for filtering of content accessed through Internet applications for e-mailing, instant messaging, chat rooms, newsgroups etc.

29. Filtering can be implemented at different levels:14

- Firstly, state-directed implementation of national content filtering schemes and blocking technologies may be carried out at the backbone level, affecting Internet access throughout an entire country. This type of filtering falls into the first category mentioned above (filtering measures resulting from state interventions) and clearly constitutes a restriction on freedom of expression and access to information in the online environment that would have to fulfil the conditions in Article 10, paragraph 2, of the European Convention on Human Rights;15

- Secondly, filtering may be implemented at the level of an Internet Service Provider. In such cases, all Internet traffic to the customer concerned will have to go through a filter at the level of the ISP before the user can access the content in question. Such a solution can be implemented as a mandatory requirement, as is the case in some Council of Europe member states that require the installation of filters at the ISP-level to block access to child pornography.16 In such cases, the filtering constitutes a state-imposed restriction on freedom of expression and access to information in the online environment that must fulfil the conditions of Article 10, paragraph 2, of the European Convention on Human Rights. Filtering at ISP level can also be a voluntary solution whereby the ISP gives its customers the opportunity to opt in or out of a filtering solution;

- Thirdly, filtering may be implemented at the institutional level on a network of a private company or public institution such as a school, library or an Internet café;

- Fourthly, filtering may take place by installing a piece of software at the level of an individual user. Many ISPs offer their customers the possibility to download such filtering products as stand-alone software or as part of a broader security package. In some Council of Europe member states, ISPs are obliged to offer filtering products to their customers.17

30. Finally, a number of websites incorporate filtering options that can often be activated or deactivated by the user of the website. Examples of this are the SafeSearch filtering option incorporated by search engine site Google and the filter against copyright-protected material employed by social networking site MySpace. Such “self-filtering” solutions combined with the consumer labelling of content may become more prevalent in the so-called Web 2.0 era when content on websites is increasingly provided by the users themselves rendering traditional filtering methods based on blacklists and keywords less efficient.18

Users of filtering products

31. Filtering products are used in various private and public contexts. Although such products are often considered to be instruments of parental control they are also used in other contexts.

32. Surveys indicate that the use of filtering products by parents to control the Internet activities of their children is quite widespread. A Eurobarometer survey covering the period from December 2005 to January 2006 revealed that close to half of all polled parents (48%) use filtering products to block and/or filter access to certain objectionable Internet content. The survey also showed that filtering is used by the majority of parents of children between 6 and 9 years (59%) whereas the use of filters is less frequent among parents of adolescents between 14 and 17 years (37%).19

33. Filtering products are also used in schools to prevent children from being exposed to, for instance, sexually explicit content that is not suitable for their age.

34. Such products are also used in universities and libraries not only to protect the user against harmful content but also to ensure that the computer is used for the purposes intended, such as research, public information etc.

35. Private and public employers also often apply filtering products to restrict employee access to certain sites (e.g. gambling, leisure activities) or to avoid damage to their image resulting from the logging of their IP addresses, for instance on sites containing pornography.

36. Filtering may take place at Internet cafés to prevent customers from accessing explicit content that may disturb other customers.

37. Filtering may also be used as a means to execute a legal or contractual decision. In some Council of Europe member States, ISPs may be required by law,20 or by the police under contractual obligations,21 to filter a given website or to block certain Internet traffic on their network.22

III. Impact of technical filtering measures

38. The use of filtering products can potentially impact on the exercise and enjoyment of human rights. The impact is most obvious with regard to freedom of expression and information but filtering products may also impact on the enjoyment of other rights such as the right to privacy.

Impact with regard to Article 10 of the European Convention on Human Rights

39. Article 10, paragraph 1, of the European Convention on Human Rights sets out the basic right to freedom of expression and reads as follows:

Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.

40. It follows from Article 10, paragraph 2, of the European Convention on Human Rights that any restriction on freedom of expression must fulfil three conditions:

    - the restriction must pursue one of the aims recognised as legitimate in Article 10,23
    - the restriction must be prescribed by law,
    - the restriction must be necessary in a democratic society.24

41. The filtering of content can be used to protect children and young people from the potential risks of the online environment. In this way, filtering products can constitute an appropriate means of encouraging access to and the confident use of the Internet.25

42. However, the discussion on filtering should not be limited to whether filtering products are “good” or “bad” as a means of protecting children and young people. The questions that should be asked are in what circumstances the use of filtering products is legitimate, and how such products can be designed and used in a way that is proportionate and compliant with human rights.

43. As regards the circumstances and the extent to which the use of filtering products is legitimate, it is important to distinguish between cases where the user of the filtering product is a public actor (such as a public library or a public employer)26 and where the user is a private actor (such as a private employer, an ISP or an Internet café).

44. Public bodies at all levels, for whose actions the State is directly responsible under the European Convention on Human Rights, should comply with Article 10, paragraph 2, of the Convention. This also applies to private sector bodies acting on the instruction of the State.27 Nevertheless, in certain contexts, such as computers used for educational purposes in public schools, there may be a legitimate need to install a filtering product. The same can be said for filtering by a public library or a public employer based on legitimate considerations (e.g. to protect minors against harmful content or to protect the image of the employer), although filtering in these situations could be more problematic with regard to freedom of expression, meaning that the extent of the filtering must be strictly proportionate to the aim pursued.

45. Although private actors are not directly28 bound by Article 10 of the European Convention on Human Rights, the decision that a person may not use a computer to access certain Internet content can raise concerns regarding freedom of expression and may also require justification.

46. Regarding the design and use of filtering products and how these impact on freedom of expression, there seem to be three main issues: the over-blocking of content, the lack of transparency and the inability of the user to control the product.

Filtering products which over-block content

47. Although the quality of filtering products has improved over the last few years, no filtering product is 100 % effective in filtering unwanted content while at the same time ensuring that harmless information is always left unfiltered.

48. A recent survey of filtering products showed that most filtering products are prone to error regarding “good” content in between 10 % and 30 % of cases.29 This problem arises, for instance, in relation to websites containing serious information on sensitive subjects which the filter may deem inappropriate, such as sites containing information on contraception, sexual education etc. Research on filtering products contains many examples of such sites being unintentionally blocked or filtered.30 One inherent problem31 is that in most cases filtering products are unable to take account of the context in which words or images occur, which leads to misinterpretations of the character of the content.

49. This so-called “over-blocking” raises concerns and can challenge the right to freedom of expression and information. If a large proportion of content blocked by a filtering product is in fact harmless, the corresponding restriction on freedom of expression and information could be said to constitute an interference which is disproportionate to the aim pursued.

The lack of transparency of filtering products and the inability of the user to control them

50. The right to freedom of expression and information is founded on principles of individual liberty and autonomy, according to which users should have control over their own Internet experience and over the content received or filtered.

51. It is a precondition that the user is able to control the filtering product, both its installation and its operation, and that its functioning is sufficiently transparent. The user should be aware that a filtering product is installed; he/she should, if appropriate, be able to deactivate it, understand how the product works, be able to fine-tune its settings and be informed when content is blocked. If these conditions are fulfilled, the user retains a degree of control over what content is received and can decide whether or not the advantages of using a filtering product to block unwanted content outweigh the risks of over-blocking harmless content.

52. On the other hand, if the user is not made aware of the installation of a filtering product, is not informed when or why content is blocked, and is unable to influence the content blocked, the use of the filter may be seen as a means of private censorship that has an adverse effect on the freedom to receive information. The less transparent and adaptable the product, the greater the risk that it may be used against the user’s wishes.

53. The transparency and adaptability of filtering products are therefore key factors in ensuring the user’s self-determination and freedom of expression and information.

54. Some filtering products lack transparency and adaptability; some, for example, do not inform the user that a filtering, blocking or logging product is active. Not all filtering products publish the list of sites which they block and some fail to explain their filtering criteria adequately.32

55. Similarly, some products suffer from a lack of adaptability by not including options to customise filtering or blocking. Filters do not always provide the option to add to or remove from the list of blocked websites or keywords, or to decide on the type of content that is blocked. In such cases users have a very limited ability to customise the product; decisions on content selection are instead outsourced to the producers of filters.

56. A transparently functioning and easily customised filtering product can, on the other hand, ensure that information selection decisions are made by the user.

57. Seen from the perspective of the author of Internet content, the lack of transparency of filtering products may also be problematic. The producers of filtering products do not notify the owners of websites containing blocked or filtered content that their content has been blocked. This means that an author whose site has been blocked will more often than not be unaware of the blocking and will consequently have very limited opportunities to rectify it. Nor do the producers of filtering products necessarily assist site owners or content authors who feel their content has been wrongly blocked in asking for a review of the “decision” taken to block or filter the content. Consequently, an author of content has very limited opportunities to ensure that his or her right to freedom of expression is not unjustifiably curtailed by the overzealous application of filtering products.

Impact with regard to other relevant provisions of the European Convention on Human Rights

58. Although the main impact of filtering products concerns the right to freedom of expression and information, the enjoyment of other human rights may also be affected by the use of filtering products.

59. The right to private life, as guaranteed by Article 8 of the European Convention on Human Rights, may be affected if the filtering product monitors and logs the Internet use of an identifiable user. To the extent that this information is stored and can be accessed by other persons, for instance an employer33 or a librarian, this may create opportunities to eavesdrop on Internet use contrary to privacy rights. On the other hand, the log files of filtering products can be useful to review how the product works under the current settings and to identify the need for adjustments. Such log files can help to ensure that blocking decisions accord with the wishes of the user. A balance needs to be found that allows the filter to be monitored and adjusted by the user without encroaching unduly on privacy rights.

60. Concerns regarding due process linked to Article 6 of the European Convention on Human Rights may be raised if there is no review procedure in cases where Internet content has been unjustifiably blocked. The fact that a filtering product blocks access to a site may de facto have an impact on the ability of a user or a producer of content to exercise his or her freedom of expression. There could therefore be a case for introducing a facility to allow users and, in appropriate circumstances, producers of content, to lodge a complaint, obtain a review and, in justified cases, restore access to blocked or filtered content.

IV. Proposals for Council of Europe action

61. The way filtering products offered by private actors are designed and used may adversely affect the extent to which freedom of expression and information and other human rights can be enjoyed by users and content providers.

62. Having regard to the case law of the European Court of Human Rights and the positive obligations of the State as the “ultimate guarantor of pluralism”, there is a role for the State to ensure responsible, “human-rights-friendly” behaviour of various stakeholders that design or use filtering products. Best practice guidelines could therefore be developed in a Council of Europe standard-setting instrument to help member states fulfil this obligation.

Building on existing Council of Europe standards

63. A starting point for formulating such guidelines should be the existing Council of Europe standards.34

64. Existing standards highlight that filtering products can play a positive role as instruments for empowering users to make informed choices on which content needs to be blocked or filtered and what content should be left unfiltered. This basic “empowerment aim” of filtering products was already recognised in 2001 when the governments of member states agreed to promote and strengthen self-regulation and user protection against illegal and harmful content by, inter alia, “encourag[ing] the development of a wide range of search tools and filtering profiles, which provide users with the ability of selecting content on the basis of content descriptors”.

65. In similar fashion, the ministers of member states declared in 2005 that “member states, with a view to protecting human rights, should promote self- and co-regulation by private sector actors to reduce the availability of illegal and harmful content and to enable users to protect themselves from both”.35

66. The underlying premise behind these standards is that filtering products should be instruments to assist users in deciding on content selection themselves and to protect users against illegal and harmful content.

67. In a new Recommendation36 adopted this year by the Ministers’ Deputies, it was recognized that there is a need to ensure transparency in the use of filtering products. The Recommendation encourages member states, the private sector and civil society to develop common standards and strategies to promote transparency and the provision of information, guidance and assistance to the individual users of technologies and services concerning inter alia “the blocking of access to and filtering of content and services with the regard to the right to receive and impart information”.37 Moreover, it follows from the Recommendation that the private sector and member states are encouraged to develop common standards regarding, inter alia, “the rating and labelling of content and services carrying a risk of harm and carrying no risk of harm especially those in relation to children” and “the rating, labelling and transparency of filtering mechanisms which are specifically designed for children”.38

68. The two basic goals of filtering products should be to empower users and to filter in a transparent manner. As has been mentioned above, not all filtering products provide the user with sufficient information about how they work and some contain too few options for the user to fine-tune the filtering settings.

69. The value added of a new Council of Europe instrument would therefore be to address these issues and to provide more specific guidelines to ensure that filtering is transparent and that filtering products can be adjusted to the specific needs of the user. Such guidelines could address both the design and the actual use of filtering products by different key actors.

Developing guidelines for key actors to ensure transparency and user empowerment

70. There are two main actors whose decisions regarding filtering products can impact on freedom of expression and other rights protected by the provisions of the European Convention on Human Rights.

71. Firstly, there are the producers of filtering products, whose decisions on how filters operate may have a significant impact on the extent to which users are informed and able to make their own choices about the kind of content to be filtered.

72. Secondly, there are the users of filtering products (i.e. those persons who decide how and when the filter is applied), who may, by their decisions, adversely affect the exercise and enjoyment of freedom of expression and other human rights. Such users include relevant state actors such as law enforcement authorities and administrative bodies, ISPs, schools, libraries, universities, employers and possibly parents.

73. To ensure and promote the transparency of filtering products and user empowerment, best practice guidelines for the producers of filtering products could include the following recommendations (a purely illustrative sketch of how some of these might be reflected in a product’s design follows the list):

    - filtering products should be designed in a way that allows the user to be informed when access to content is blocked or filtered;

    - filtering products should be designed and implemented (e.g. in web browsers) so that the default settings do not automatically activate filtering (filtering is activated by the user), or, at least, the user is informed of options to change the settings of the product to allow the blocked content to be displayed;
    - filtering products should be designed in a way that allows the user to be sufficiently informed of the filtering criteria and of the reasons why the product has blocked or filtered specific content;
    - filtering products should contain sufficient options to fine-tune the filtering settings, e.g. by allowing the user to add to or remove from the list of blocked or allowed sites, the list of blocked or allowed keywords, the categories of content to be blocked etc.;
    - filtering products should inform the user when there is an update of any part of the software, e.g. when the list of blocked sites is renewed;
    - filtering products should contain a log file of all content which has been blocked or filtered in a given time period, allowing the user to review at any time whether the settings are appropriate;
    - filtering products should contain sufficient information and guidance for the user describing the way the filter works, the different filtering settings, criteria etc., empowering the user to make his or her own decisions on what material should be filtered. The information should be displayed when the filter is installed and be readily available at all times afterwards in a help file etc.;
    - the producers of filtering products should have a policy that allows owners of blocked sites or other producers of content to ask for their site or content to be unblocked if the filter’s classification of blocked content has been made in error;
    - filtering products should not allow their producers to log attempts by the user to access blocked content, unless these attempts are made anonymous.
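By way of illustration only, a minimal sketch of how a product might reflect some of the above recommendations (informing the user when content is blocked, keeping a reviewable log file and letting the user edit the lists); all names, list entries and file names are invented for the example:

```python
import logging
from urllib.parse import urlparse

# User-editable lists: the user, not the producer, has the final say.
user_blocked_hosts = {"objectionable.example"}
user_allowed_hosts = set()

# A log file the user can consult at any time to review blocking decisions.
logging.basicConfig(filename="filter.log", level=logging.INFO)

def request_allowed(url: str) -> bool:
    """Return True if access is granted; otherwise inform the user and log the decision."""
    host = urlparse(url).hostname or ""
    if host in user_allowed_hosts:
        return True
    if host in user_blocked_hosts:
        # Transparency: tell the user what was blocked and why, and record the
        # decision so that the settings can be reviewed and fine-tuned later.
        print(f"Access to {url} was blocked (host is on your blocked list). "
              "You can unblock it in the filter settings.")
        logging.info("Blocked %s (user blocked list)", url)
        return False
    return True
```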

74. To ensure transparency and user empowerment, best practice guidelines for the users of filtering products could include the following recommendations:

    - users should be made clearly aware at all times when filtering of content takes place;
    - filtering should only be applied to pursue a specific and legitimate aim such as protection of minors against harmful content and it should be ensured that the level of filtering is proportionate to the legitimate aim pursued and necessary in a democratic society;
    - filtering should be enabled by an informed user, or as a minimum a procedure should be in place to ensure that users have the possibility to request that the filtering is disabled. Users should receive clear information on the procedure and who they must contact to ask for the filtering to be disabled;
    - filtering products (e.g. through access to the log files) should not be used to monitor what content the user accesses or downloads, unless the monitoring is justified by specified and legitimate reasons, which should be proportionate and necessary in a democratic society, and the user is clearly informed beforehand;
    - filtering should be a complement to other strategies on how to tackle illegal or harmful content, in particular the development and provision of information literacy for the users, e.g. children.39

Appendix I

Council of Europe standards related to technical filtering measures

None of the existing Council of Europe standards on the information society have technical filtering measures as their sole or main focus although a few of those standards address the issue of filtering of Internet content.

Recommendation Rec(2001)8 on self-regulation concerning cyber content (self-regulation and user protection against illegal or harmful content on new communications and information services) contains principles on labelling (Chapter II – Content descriptors), filtering products (Chapter III – Content selection tools) and user empowerment (Chapter VI – User information and awareness).

Regarding labelling of content, paragraph 6 of the Recommendation asks Member States to encourage a set of content descriptors to be defined, which should “provide for neutral labelling of content which enables users to make their own value judgments over such content”. Paragraph 7 of the Recommendation makes reference to content descriptors indicating violent and pornographic content, content promoting the use of tobacco, alcohol or gambling services and content which allows unsupervised and anonymous contacts between minors and adults. Paragraph 8 of the Recommendation states that content providers should be encouraged to apply such content descriptors “in order to enable users to recognise and filter such content regardless of its origin”.

As regards filtering products, paragraph 9 of the Recommendation urges Member States to encourage the development of a wide range of search tools and filtering profiles based on content descriptors. Having regard to Article 10 of the European Convention on Human Rights, paragraph 10 of the Recommendation recommends that filtering of content should be applied by users on a voluntary basis. Filtering products can thus empower users, for example parents or other persons or institutions having responsibility over children, to make qualified choices about the type of lawful content that should be accessible to children.40

Finally, paragraph 20 of the Recommendation urges Member States to encourage public awareness and information about, inter alia, content descriptors and filtering products. Such information should be accessible to all, for instance through educational institutions or public libraries, and information about filtering harmful content might especially be addressed to parents.41

Principle 3 of the Declaration on Freedom of Communication on the Internet, adopted by the Committee of Ministers on 28 May 2003, deals more specifically with blocking or filtering measures imposed by the State.

Paragraph 1 of the Declaration underlines the importance of the absence of prior state control over what the public can search for on the Internet. Although the State should by no means take broad measures to block undesirable content, an exception must be allowed for the installation of filters for the protection of minors. Where minors have access to the Internet, for example in schools or libraries, public authorities may require filters to be installed on computers to block access to harmful content. Paragraph 2 of the Declaration concerns measures taken by the State to remove illegal content or block access to it following a preliminary or final decision of the competent national authority on its illegality under penal, civil or administrative law. It is underlined that such measures must be in accordance with the requirements of Article 10, paragraph 2, of the European Convention on Human Rights and would have to be directed at clearly identifiable Internet content.42

The Declaration on human rights and the rule of law in the information society, adopted by the Committee of Ministers on 13 May 2005, does not explicitly mention technical filtering measures although it follows from the Declaration that “member States should maintain and enhance legal and practical measures to prevent state and private censorship”.43 The Declaration also underlines that “member states, with a view to protecting human rights, should promote self- and co-regulation by private sector actors to reduce the availability of illegal and harmful content and to enable users to protect themselves from both”.44 Finally, the declaration encourages private sector actors to address, inter alia, the issue of “private censorship (hidden censorship) by Internet service providers, for example blocking or removing content, on their own initiative or upon the request of a third party”.45

Although Recommendation Rec(2006)12 on empowering children in the new information and communications environment contains no provisions on the use of technical filtering measures, it should be noted that it follows from the preamble of the Recommendation that “an essential part of the response to content and behaviour carrying a risk of harm lies in the development and provision of information literacy, defined as the competent use of tools providing access to information, the development of critical analysis of content and the appropriation of communication skills to foster citizenship and creativity, and training initiatives for children and their educators in order for them to use information and communication technologies and services in a positive and responsible manner”.

The new Recommendation Rec(2007)11 on promoting freedom of expression and information in the new information and communications environment, adopted by the Committee of Ministers at the 1005th meeting of the Ministers’ Deputies on 26 September 2007, provides a general framework for the adoption of more detailed standards on technical filtering measures. It encourages member states, the private sector and civil society to develop common standards and strategies to promote transparency and the provision of information, guidance and assistance to the individual users of technologies and services concerning, inter alia, “the blocking of access to and filtering of content and services with the regard to the right to receive and impart information”.46 Moreover, it follows from the Recommendation that the private sector and member states are encouraged to develop common standards regarding, inter alia, “the rating and labelling of content and services carrying a risk of harm and carrying no risk of harm especially those in relation to children” and “the rating, labelling and transparency of filtering mechanisms which are specifically designed for children”.47

The recent Recommendation Rec(2007)16 on measures to promote the public service value of the Internet, adopted by the Committee of Ministers at the 1010th meeting of the Ministers’ Deputies on 7 November 2007, calls on member states to promote freedom of communication, regardless of frontiers, on the Internet, in particular by “not subjecting individuals to any licensing or other requirements having a similar effect, nor any general blocking or filtering measures by public authorities, or restrictions that go further than those applied to other means of content delivery”.48

The Parliamentary Assembly of the Council of Europe has expressed its general support for the development of filtering technologies to encourage self-regulation by network operators at international level.49 More specifically, in the context of the fight against racism and xenophobia in cyberspace, the Assembly has stressed the ethical responsibility of and need for self-disciplinary efforts by access providers and hosts to fight racism and xenophobia through inter alia the labelling and classification of sites and the establishment of filtering.50

Finally, the Internet Literacy Handbook includes a fact sheet on filtering and labelling containing information and best practice guidelines intended for teachers, parents and young people. The Handbook highlights a number of issues related to the use of filters such as the fact that “Filtering software-services label pages according to their value systems and social agendas” and that “Filters may block useful sites relating to contraception or sex education due to certain key words they contain”. As a best practice guideline the Handbook recommends to have a close look at how a filter works before installing it, to use electronic aids with discrimination and to allow access only to approved sites contained in “white lists” for the youngest Internet users.

Appendix II

Other material concerning technical filtering measures

European Union

The use of filtering measures has been encouraged by the European Union in two recommendations on protection of minors and human dignity from 1998 and 2006 respectively.

The 1998 recommendation51 urges Member States, the industries and other parties concerned to co-operate in developing national self-regulation frameworks, producing codes of conduct concerning protection of minors and human dignity and supporting the development and use of parental control measures, including filtering products. Regarding filtering products, it is underlined that such products should be easy to use and flexible in order to enable minors under the charge of parents and teachers to have access to services even when unsupervised. It is also mentioned that national codes of conduct should address basic rules on the conditions under which products such as filtering software installed and activated by the user, and filter options activated at the user’s request by service operators at a higher level, are supplied to users to facilitate parental control. On a more general level, it follows from the 1998 recommendation that codes of conduct drawn up should be proportional and be assessed in the light of, inter alia, the principles of freedom of expression, protection of privacy and free movement of services.

The 2006 recommendation52 again encourages the audiovisual and on-line information services industry and other parties concerned to develop positive measures for the benefit of minors, including initiatives to facilitate their wider access to audiovisual and on-line information services, while avoiding potentially harmful content, for instance by means of filtering systems. The role of filters in preventing information offending against human dignity from passing through the Internet, and the intent of the European Commission to provide information for parents about the effectiveness of filtering software, is specifically mentioned.

Studies

In addition to the two European Union recommendations mentioned above, an action line in the Safer Internet Plus programme is devoted to tackling unwanted and harmful content.

As part of this programme the European Commission is currently conducting a 3-year study, carried out by Deloitte and Katholieke Universiteit Leuven, to provide an expert, vendor/supplier-independent and objective assessment of technical solutions to filter Internet content for children between 6 and 16 years (Safer Internet Plan – Benchmark of products to filter potentially harmful Internet content). A synthesis report covering the first year of the study (2006) has been published. The report does not address the issue of filters from a freedom of expression perspective but nonetheless contains interesting findings regarding the effectiveness and functioning of filters. Some of the problem areas identified in the report were:

- The filtering products tested performed well in filtering content from sites with millions of hits per day, containing obvious content (read: pornography) expressed in a common language (read: English). However, when trying to filter less obvious but equally harmful content, expressed in a non-English language on for instance private sites, none of the tested products were capable of adequately filtering content.

- All products tested made a wrong filtering decision in more than 25% of cases when tested against content deemed harmful for adolescents between 11 and 16 years. Most filters tested tended to block too little (missing bad content in between 40% and 80% of cases) more often than too much (blocking good content in between 10% and 30% of cases).

- Some products give the user too few options to customise the filtering/blocking and provide little or no clarity on the type of words that trigger content to be blocked.

- Some filters do not indicate to the user that filtering, blocking or logging is active.

Also as part of the Safer Internet programme, the European Commission carried out a Eurobarometer survey from December 2005 to January 2006 on the use of the Internet by children, including the use of parental control measures. The survey was carried out in the then 25 EU Member States as well as Bulgaria, Romania, Croatia and Turkey. The survey shows that 48% of the respondents (parents and other caretakers of children) apply filtering or blocking products to avoid access to certain websites when their children use the Internet. According to the survey, the use of filtering is more widespread among parents living in the 15 “old” EU Member States (50%) than in the 10 “new” Member States (38%) and the accession and candidate countries (26%). The survey also shows that filtering is used by the majority of parents of children between 6 and 9 years (59%) whereas the use of filters is less frequent among parents of adolescents between 14 and 17 years (37%). Interestingly, a high percentage of parents thought that filters were applied at school (31%) while only a small percentage of parents thought that filters were applied in libraries (3%) and Internet cafés (1%).

A relatively recent study of filtering products from a freedom of expression perspective has been conducted by the Brennan Center for Justice at New York University School of Law. The aim of this study was to provide a synthesis of studies on filtering products, and it initially led to the publication of a report in 2001 summarizing the results of more than 70 empirical studies on the performance of filters. An updated 2nd edition of the report was published in 2006, taking into account studies conducted since the first edition. The updated report concludes that filters continue to block large amounts of valuable information and that the widespread use of filters presents a serious threat to fundamental free expression values. The report recommends that schools and libraries choose filters that easily permit disabling and unblocking of wrongly blocked sites and that educational approaches to online literacy and Internet safety be developed as alternatives or complements to the use of filters.

Another on-going project is the OpenNet Initiative (ONI), undertaken through an academic partnership of four institutions: the Citizen Lab at the Munk Centre for International Studies, University of Toronto, the Berkman Center for Internet & Society at Harvard Law School, the Advanced Network Research Group at the Cambridge Security Programme, University of Cambridge, and the Oxford Internet Institute, Oxford University. ONI aims at investigating, exposing and analysing Internet filtering and surveillance practices worldwide. The project follows a multi-disciplinary approach that includes development and deployment of a suite of technical enumeration products and core methodologies for the study of Internet filtering and surveillance, capacity-building among networks of local advocates and researchers, advanced studies exploring the consequences of current and future trends and trajectories in filtering and surveillance practices and their implications for domestic and international law and governance regimes. The project has already produced tools and data on censorship of political, social, and conflict/security related content, as well as on forbidding the use of Internet tools and services. Reports include country profiles in all regions of the world.53

1 This document was classified restricted at the date of issue. It was declassified at the 1022nd meeting of the Ministers’ Deputies (26 March 2008) (see CM/Del/Dec(2008)1022/5.1).

2 Paragraph 19 of the 7th European Ministerial Conference Action Plan reads: “Follow closely legal and other developments as regards liability for content made available to the public on the Internet and, if necessary, take any initiative, including the preparation of guidelines, inter alia, on the roles and responsibilities of intermediaries and other Internet actors in ensuring freedom of expression”. Paragraph 23 of the 7th European Ministerial Conference Action Plan reads: “Promote the adoption by member States of measures to ensure, at the pan-European level, a coherent level of protection for minors against harmful content in traditional and new electronic media, while securing freedom of expression and the free flow of information”.

3 Paragraphs 12 and 16 of Resolution No. 3 on Human rights and regulation of the media and new communication services in the Information Society, adopted by the Ministers of States participating in the 7th European Ministerial Conference on Mass Media Policy (Kyiv, 10-11 March 2005).

4 According to Article 20 of the new Council of Europe Convention on the Protection of children against sexual exploitation and sexual abuse (CETS No. 201), Parties to the convention shall take the necessary legislative or other measures to ensure that, inter alia, the act of knowingly obtaining access to child pornography is criminalised in their national laws.
5 According to Article 10 of the Convention on Cybercrime (ETS No. 185), Parties to the convention shall adopt such legislative measures as may be necessary to establish as criminal offences under their national laws, inter alia, infringements of copyright and related rights where such acts are committed wilfully, on a commercial scale and by means of a computer system.

6 Cf. Principle 3 of the Committee of Ministers’ Declaration on Freedom of Communication on the Internet, adopted on 28 May 2003 at the 840th meeting of the Ministers’ Deputies.

7 See Informationsverein Lentia and others v. Austria, judgment of 24 November 1993, Series A, No. 276, § 38.

8 Most major search engines offer some type of filtering ability to prevent pornographic or other unwanted content from appearing in the search results.

9 See Yaman Akdeniz: Who Watches the Watchmen? The Role of Filtering Software in Internet Content Regulation, The Media Freedom Internet Cookbook, OSCE, 2004, p. 107.

10 A comprehensive collection of examples is given in the synthesis report “Internet Filters – a public policy report”, 2nd edition, Brennan Center for Justice at New York University School of Law, 2006.

11 See http://www.fosi.org/icra

12 See http://ec.europa.eu/information_society/activities/sip/projects/targeted/filtering/index_en.htm#filtering

13 A benchmark study conducted under the Safer Internet Plan did not identify any products among the 30 surveyed that classified content using intelligent analysis, cf. Safer Internet – Test and benchmark of products and services to filter Internet content for children between 6 and 16 years, Synthesis report, 2006 edition, p. 24.

14 The following is based on the description of filtering at the OpenNet Initiative website: http://opennet.net/about-filtering

15 See Principle 3 of the Committee of Ministers’ Declaration on freedom of communication on the Internet, adopted on 28 May 2003.

16 Child pornography filters at ISP level are mandatory in, inter alia, the United Kingdom, Denmark and Italy.

17 In France, ISPs have since 2004 been obliged by law to inform their customers of the existence of filtering software or to provide them with such software; see http://www.saferinternet.org/ww/en/pub/insafe/news/articles/0907/fr.htm

18 Safer Internet – Test and benchmark of products and services to filter Internet content for children between 6 and 16 years, Synthesis report, 2006 edition, p. 6.

19 Safer Internet – Special Eurobarometer, May 2006, p. 26.

20 E.g. the French law on trust in the digital economy (Loi n° 2004-575 du 21 juin 2004 pour la confiance dans l'économie numérique), article 6-I-8 and article 8-I.
21 See note 15 above.
22 See also the decision of 29 June 2007 by the Tribunal de Première Instance de Bruxelles, SABAM c/ SA Scarlet (formerly Tiscali), in which a Belgian ISP was ordered to implement technical filtering measures in order to prevent its users from illegally downloading music files. The decision is available online at http://www.juriscom.net/documents/tpibruxelles20070629.pdf

23 National security, territorial integrity or public safety, protection of health or morals, prevention of disorder or crime, protection of the reputation or rights of others, prevention of the disclosure of information received in confidence or maintenance of the authority and impartiality of the judiciary.

24 According to the case law of the Court, this implies "a pressing social need" and involves a requirement that the restriction be proportionate to the aim pursued.
25 At the same time, it should be noted that an essential part of the response to content and behaviour carrying a risk of harm lies in the development and provision of information literacy; see Recommendation Rec(2006)12 of the Committee of Ministers to member states on empowering children in the new information and communications environment.

26 For a recent example of a case before the European Court of Human Rights where the State was held accountable for the actions of a public employer, see Copland v. United Kingdom, judgment of 3 April 2007 (violation of Article 8 of the Convention).

27 For instance, it is clear that when private use of filtering products is the result of interventions by State authorities (court orders, legislative actions, and/or administrative proceedings), these interventions would constitute a restriction on freedom of expression and access to information in the online environment and would have to follow the requirements of Article 10, paragraph 2, of the European Convention on Human Rights.
28 According to the case law of the European Court of Human Rights, the State may in certain circumstances be held responsible under Article 1 of the Convention for breaches of human rights committed by private parties (so-called indirect horizontal effect).

29 Safer Internet – Test and benchmark of products and services to filter Internet content for children between 6 and 16 years, Synthesis report, 2006 edition, p. 27.

30 See “Internet Filters – a public policy report”, 2nd edition, Brennan Center for Justice at New York University School of Law, 2006.

31 See also the issues identified under the description of the different filtering methods employed (section II above).

32 Safer Internet – Test and benchmark of products and services to filter Internet content for children between 6 and 16 years, Synthesis report, 2006 edition, p. 27.

33 As shown by a recent judgment of the European Court of Human Rights, States may be held accountable for the monitoring of a public employee’s Internet use, cf. Copland v. United Kingdom, judgment of 3 April 2007 (violation of Article 8 of the Convention).

34 See Appendix I to this report containing extracts from existing Council of Europe standards concerning filtering measures.

35 Declaration on human rights and the rule of law in the information society, adopted by the Committee of Ministers on 13 May 2005.
36 Recommendation Rec(2007)11 on promoting freedom of expression and information in the new information and communications environment, adopted by the Committee of Ministers at the 1005th meeting of the Ministers’ Deputies on 26 September 2007.

37 Guidelines, Part I, “Empowering individual users”, point vi.

38 Guidelines, Part II, “Common standards and strategies for reliable information, flexible content creation and transparency in the processing of information”, points i and ii.

39 Recommendation Rec(2006)12 on empowering children in the new information and communications environment.

40 See paragraph 37 of the Explanatory Memorandum to the Recommendation.

41 See paragraph 50 of the Explanatory Memorandum to the Recommendation.

42 See explanatory note to the Declaration on freedom of communication on the Internet.

43 See Part I, paragraph 1, “The right to freedom of expression, information and communication”.

44 See Part II, paragraph 1, “Council of Europe member states”.

45 See Part II, paragraph 3, “Private Sector”.

46 Guidelines, Part I, “Empowering individual users”, point vi.

47 Guidelines, Part II, “Common standards and strategies for reliable information, flexible content creation and transparency in the processing of information”, points i and ii.

48 Appendix to the Recommendation, Part III, “Openness”, paragraph 23.2, litra a.

49 Paragraph 5, point iii, litra o, of Recommendation 1332(1997) on the scientific and technical aspects of the new information and communications technologies, adopted on 23 June 1997.

50 Paragraph 5 of Recommendation 1543(2001) on Racism and xenophobia in cyberspace, adopted on 8 November 2001.

51 Council Recommendation 98/560/EC of 24 September 1998 on the development of the competitiveness of the European audiovisual and information services industry by promoting national frameworks aimed at achieving a comparable and effective level of protection of minors and human dignity, OJ 1998, L 270, p. 48.

52 European Parliament and Council Recommendation 2006/952/EC of 20 December 2006 on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and on-line information services industry, OJ 2006, L 378, p. 72.

53 Tools and reports released by the ONI project are available at http://opennet.net

