
Strasbourg, 7 December 2016

CEPEJ(2016)15

EUROPEAN COMMISSION FOR THE EFFICIENCY OF JUSTICE

(CEPEJ)

Handbook for conducting satisfaction surveys aimed at Court users in Council of Europe member States

As adopted at the 28th plenary meeting of the CEPEJ on 7 December 2016


Handbook prepared by the CEPEJ-GT-QUAL

on the basis of the preparatory work of Jean-Paul JEAN and Hélène JORRY, scientific experts (France), completed by Martial PASQUIER, scientific expert (Switzerland)

Table of contents

1.         Introduction

1.1       Justice surveys

1.2       Survey types

1.3       Court users

1.4       The approach proposed

2.         Constructing a survey of court users

3.         Survey steps

3.1       Scope, objectives and organisation of the survey

3.2       Determination of target groups and user categories

3.3       Choice of method

3.3.1     Qualitative surveys

3.3.2     Quantitative surveys

3.3.3     Additional methods

3.4       Selecting a sample

3.5       The questionnaire and the organisation of the survey

3.5.1     Management and administration

3.5.2     Survey timing

3.5.3     The questionnaire

3.5.4     Questionnaire medium

3.5.5     Response scales

3.6       Recording and analysing results

3.7       Reporting results and determining measures to be taken

Appendices

Information for survey managers

Model questionnaire for court users

Model questionnaire for lawyers


1.          Introduction

  1. The CEPEJ's working group on the quality of justice (CEPEJ-GT-QUAL) proposed to draft a methodological handbook for central court authorities and individual courts wishing to develop user satisfaction surveys. This tool was to be based in particular on the experience of certain member States and ensuing best practices.

  1. The Checklist for Promoting the Quality of Justice and the Courts adopted by the CEPEJ in July 2008 (CEPEJ(2008)2) is an essential point of reference for this work.

  1. Satisfaction surveys are a key element of policies aimed at introducing a culture of quality. Taking public satisfaction into account reflects a concept of justice focused more on the users of a service than on the internal performance of the judicial system.

1.1       Justice surveys

  1. Methods of conducting surveys on justice at European level vary considerably: trend recording and general public opinion and image surveys (such as the European Union’s “Eurobarometer”), more qualitative surveys of sample groups of users, and surveys assessing the satisfaction of actual users. Moreover, these surveys, conducted either by publicly funded bodies or by private organisations, may be regular or intermittent and are sometimes employed merely to measure the impact of a high-profile case or to identify characteristics often associated with the justice system, such as delays, costs or the lack of intelligibility of decisions.

  1. In this connection, the CEPEJ is interested more particularly in regular surveys carried out on the basis of tried and tested questionnaires in order both to measure changes in the evaluation of services provided and to tie the justice system into a process of systematically improving the quality of the services offered. Moreover, the CEPEJ is focusing on court users' evaluations based on their own experience. Its aim is not to carry out surveys of representative samples of the population, the results of which can be no more than perceptions of justice and will not directly enable improvements to the services provided.

1.2       Survey types

  1. Several survey types can be employed to question court users:

§  Opinion surveys. These involve asking people to give their opinion or preference on a particular subject, such as How much do you trust the justice system? or What is the image of the justice system?

§  Quality surveys. These involve studying the extent to which a service lives up to the promise made: How quickly did you receive the summons?

§  Satisfaction surveys. These measure the extent to which a service lives up to users’ expectations: Are you satisfied with the signposting at the courthouse?

  1. These types of question can be grouped together within the same questionnaire but measure different things. Above all, the use that can be made of the results differs considerably. Although a court’s signposting system can be improved to increase user satisfaction, it is not possible to take direct action to improve the level of trust in the justice system.

1.3       Court users

  1. Various categories of users can be specified:

§   People involved in a case for various reasons: in criminal cases as victims or perpetrators, witnesses or jury members; in civil cases as claimants or defendants; in administrative cases, as applicants or interveners. The perception of the courts’ performance in terms of reception of the public, the length or cost of proceedings is important, as is the perception of the input of all those involved, first and foremost judges, lawyers and court staff. All aspects must be considered, for instance the fact that the individuals surveyed might have won or lost their civil cases. Specific categories of user may also be surveyed, particularly victims of offences.

§  Legal professionals, with a distinction being drawn between:

-        professionals belonging to the public justice service, such as judges, public prosecutors and the non-judge and non-prosecutor staff of the courts and public prosecution services;

-        professionals who are essential partners of the courts, especially lawyers.

  1. It is also always possible to conduct surveys among key players such as bailiffs, notaries, expert witnesses and interpreters, as well as public sector employees and associations working directly with the courts to prepare or enforce judges’ decisions (social workers, police, probation staff, prison officers, etc.). This type of survey, based on interviews and self-administered questionnaires, can be used either to explore a specific issue or supplement the main survey in the event of a comprehensive evaluation of the system's operation.

1.4     The approach proposed

  1. There are many methods of studying how people assess a service: qualitative and quantitative surveys (interviews, self-administered questionnaires, telephone surveys) and on-the-spot observation. The choice of a method and the frequency of the survey depend on several elements:

§  objectives (monitoring user satisfaction, measuring court performance, improving service delivery, reforming the judicial system);

§  scope (a service area, a court, several courts of the same type, several courts in the same geographical district, etc.);

§  target groups: court users (all users of a particular court, certain users such as victims, persons involved in divorce proceedings, etc.), professionals (in the categories referred to above);

§  human, technical and budgetary resources available to the survey sponsor.

  1. With this handbook, the CEPEJ wishes to propose an inexpensive and approved "basic product" that is easy to use and focuses on the fundamental problems and issues of the operation of the courts. A tool of this kind is intended to be widely distributed to the courts of member States and its use should entail little cost for the latter.

  1. A second level of need might concern a more sophisticated multi-entry product, which could be adapted to specific judicial cultures, the problems anticipated and the amount of money available.

  1. Therefore a model multi-entry survey of actual court users is proposed here, accompanied by a methodological guide making use of trials already run in a number of member States and of the CEPEJ's work addressing the substantive issues involved. The aim is to develop an operational tool within an overarching approach to improving the quality of justice. It takes the form of an adjustable kit with a standard model that can be adapted by users according to their needs, resources and priorities.

For a more detailed presentation and analysis of existing European survey systems, see the detailed report by Jean-Paul JEAN and Hélène JORRY – Document CEPEJ(2010)2, available on www.coe.int/CEPEJ.

2.       Constructing a survey of court users

  1. For a comprehensive approach to quality assessment, it would be advisable before conducting a user survey to carry out a qualitative survey using individual interviews, group meetings or on-the-spot observation of behaviour, supplemented by an analysis of correspondence and complaints (examples: the Polish Ombudsman,[1] the Court of Grasse in 2002,[2] the French Ombudsman (Médiateur)), in order to define more clearly the target group, the scope and the methodology of the survey and to involve stakeholders in the assessment procedure.

  1. However, given the cost and the resources required, a qualitative survey of this sort is not automatically necessary; the tool proposed by the CEPEJ draws on good practice in member States and can be adapted to specific local features after a few consultation meetings. The following steps are proposed:

§    scope, aims and organisation of the survey,

§    determination of user categories,

§    choice of method

§    selection of a sample,

§    questionnaire and organisation of the survey,

§    recording data and analysing results,

§    reporting results and determining measures to be taken.

  1. In some courts, following the example of the law courts in the cantons of Berne and Geneva (Switzerland), a steering committee has been set up. Such a committee, which may be internal or external to the court concerned and consists of court professionals, court users and external specialists (academics, researchers, etc.), makes it possible to supervise the survey, adapt the final version of the questionnaire to the court’s needs and aims and make arrangements for its implementation. The CEPEJ recommends setting up a local steering committee so as to involve key justice system players and make it easier to conduct surveys.


3.         Survey steps

3.1       Scope, objectives and organisation of the survey

  1. The first step is to define as clearly as possible the scope of the survey and its intended aims and to put in place the organisation necessary to carry it out. This step is very important, firstly in order to enable the survey to be conducted in a professional manner and secondly to match the aims of the survey with tangible means available for introducing improvements. Aims that are over-ambitious to begin with can subsequently prove a drawback if problems identified during the results analysis phase cannot be the subject of measures that enable services to be improved. In such a case, it is better to conduct a more limited survey that is therefore less ambitious but whose results will definitely bear fruit.

  1. As far as the scope is concerned, it is important to carry out the following tasks before conducting the survey:

§    gathering existing information from previous surveys or other surveys. This information can be very useful when designing the survey;

§    identifying the human and financial resources available to plan and, above all, conduct the survey: the resources have a potentially significant impact on the choices subsequently made, so it is important to know what resources the individuals in charge of the survey will have at their disposal;

§    setting the survey timetable: this involves establishing the timeframes for the main stages of the survey;

§    determining the communication policy: such surveys often raise questions or even trigger distrust both internally and externally. It is recommended that the users concerned be clearly informed in advance in order to avoid any hostile reaction to the survey and the results.

  1. Putting in place a survey also requires that the aims be clarified first, for example:

§    identifying any problems and grounds for satisfaction or dissatisfaction,

§    identifying actual and potential expectations,

§    understanding changes in users’ expectations,

§    measuring the level of satisfaction in connection with the service provided (environment, costs, length of proceedings, reception, etc.).

  1. Depending on the judicial system and the general aims of the survey, other subjects not directly linked to user satisfaction with the courts, such as their opinion or degree of trust in the justice system, can clearly also be included.

  1. As far as this stage is concerned, it should also be determined whether the survey is being carried out on an ad hoc basis or as part of a predefined series. In the latter case, it is particularly important to ensure that all or some of the previous questions are repeated in order to be able to compare changes in results.

  1. In addition, this stage makes it possible to ascertain whether the conduct of the survey and the analysis of its results will require the establishment of a steering committee or the involvement of an independent outside body. A research laboratory or university team may well be interested in working in partnership with the court. If a private company is used, the relevant cost must be taken into account.

3.2       Determination of target groups and user categories

  1. In the case of national opinion surveys, such as those conducted in Belgium, France and Spain, the entire population can be questioned and specific groups can be set up on the basis of socio-demographic factors (age, gender, faith, etc.), language and geographical representativeness and whether individuals are actual court users or not.

  1. On the other hand, satisfaction surveys in courts must be conducted among actual users of the court concerned. The identity of the individuals questioned may depend on the particular service to be surveyed. Depending on whether it is intended to conduct a survey on the operation of the court as a whole or only on some of its services or on certain types of court (such as family courts), either all or only some users will be chosen. The following specific categories of user should therefore be considered:

§  parties: individuals undergoing trial are one category of users of the public justice service. Some countries, such as Canada, the Netherlands and Switzerland, use the label “customer/client” over and above its commercial meaning to describe the individual receiving the service delivered (consumer, client, beneficiary, etc.);

§  lawyers: registered with the Bar association of the court concerned, or outside its district but occasionally acting for clients there;

§  various professionals belonging to the court and the public prosecution service: judges, Rechtspfleger, clerks, court officials, members of the prosecution service, etc.;

§  legal professionals in most frequent contact with the court concerned (notaries and bailiffs);

§  other professionals frequently called upon to assist the courts, whose contribution substantially affects the quality of justice: expert witnesses and interpreters.

3.3       Choice of method

  1. There are a number of methodologies that enable user satisfaction to be studied depending on whether the aim is to understand the grounds for satisfaction or dissatisfaction (conduct of qualitative surveys) or to measure satisfaction with specific aspects (conduct of quantitative surveys).

3.3.1    Qualitative surveys

  1. Qualitative surveys are more exploratory in nature and can be used to identify trends in user satisfaction/expectations. More generally, they can often provide very useful information, which can then be studied as part of a quantitative survey.

  1. Various methods can be used:

§  individual interviews to record opinions and understand users’ motives, with a view to preparing a questionnaire

§  interview with a group of users (group interview) to record their experience and compare their viewpoints

  1. This is admittedly a costly and time-consuming method requiring specialist interviewers, but it is necessary for an overarching quality-based approach (Netherlands). Combining a preliminary qualitative survey with a quantitative survey makes it possible to achieve the greatest possible detail and most comprehensive coverage when studying user satisfaction and/or expectations.

3.3.2    Quantitative surveys

  1. Quantitative surveys measure user satisfaction statistically on the basis of a representative sample if the number of users is large.

Various methods can be used:

§  Self-administered questionnaires within the courts

Example: a questionnaire made available at the court’s reception desk or on leaving the hearing (Netherlands (user surveys), Switzerland (Berne), United Kingdom, United States).

  1. This is the cheapest method of obtaining a very good response rate. In the first user surveys in the Netherlands, questionnaires sent out by post or administered by telephone had a response rate of between 10% and 20%. A change in administration method (interviews conducted with users as they leave the hearing) increased the response rate to 70%. However, distribution of the questionnaire immediately after the hearing may also entail certain risks. Firstly, those questioned might not feel completely free to reply as they wish because of the close presence of court staff and other users. Secondly, the potentially emotional character of such a situation may lead these individuals to reply in terms that are too emotive, whether positive or negative. It is therefore recommended that self-administered questionnaires be used within a court solely for surveying users who are not too affected by or involved in judicial decisions and for simple aspects relating to user satisfaction. The questionnaire should consequently not be too long and the survey duration should be limited to a few minutes. If the process is likely to exceed 5-7 minutes, it is recommended that a separate room be made available in which respondents can sit down and take the time to reply to the questions.

§  Self-administered postal or internet questionnaires

  1. This method is less expensive, but the response rate may be low without a special awareness-raising campaign. The electronic questionnaire, sent by e-mail or posted on a dedicated website (examples: the Netherlands 2009 survey into problems of access to the courts and the surveys conducted in Switzerland (Geneva), the United Kingdom (registry users and jurors), Canada and the United States), selects a specific category, Internet users, which obviously affects representativeness (age, socio-cultural level, etc.). But this method of distribution is recommended for direct surveys of professionals, with excellent response rates if appropriate explanations and guarantees are given to the addressees, as in the Netherlands (surveys of professionals) and France (2008 survey aimed at judges and prosecutors).

  1. This method nonetheless entails the use of a data file covered by domestic legislation on personal data protection.

  1. It is difficult to provide any indications as to the recommended maximum time for completing the questionnaire. It generally increases with the level of competence of the categories of users surveyed, but no more than 15-20 minutes should be necessary to reply to paper or screen-based questionnaires.

§  Telephone questionnaires

  1. This method is more time-consuming and entails recourse to a polling agency and/or specialist interviewers to administer the questionnaire by telephone. It is therefore expensive but can be used to construct representative samples and refine analysis and the degree of detail in replies (examples: Austria, Belgium, Finland (2008), France (2001 and 2008 user surveys, 2006 victim survey), Netherlands (initial court-user surveys) and Spain - 2008 judicial career survey). Care should be taken when conducting telephone surveys to ensure that the time taken for an interview does not exceed 10-15 minutes unless a telephone appointment has been arranged in advance.

§  Home or in-court interviews

  1. This method entails use of a questionnaire and face-to-face interviews. Since it necessitates recruitment of interviewers and recourse to a specialist body, it is more expensive (examples: Austria, France (1997 survey), Germany, the Netherlands (2009 survey into problems of access to the courts) and Spain (regular survey and 2001 survey)). It is, however, the most suitable method if the subjects dealt with are complex or give rise to intense emotions. As a rule, personal interviews require more time. Unless qualitative aspects are surveyed (such as the recording of users’ experiences [Critical Incident Technique]), the length of personal interviews should not exceed 30 minutes.

  1. In Italy (Turin, Catania, Rome), a number of in-court surveys were conducted between 2011 and 2016 using groups of university students as interviewers. This can reduce the cost of the survey, since the students can treat the activity as field training, but the use of non-professional interviewers may lower the quality of the data collected.

3.3.3    Additional methods

a)  On-the-spot observation

  1. In some cases, it may be recommended to observe users’ behaviour rather than asking them questions. This method, which is interesting as it does not call for individuals to indicate their opinion or perception, can only be employed for the relatively simple and quantifiable aspects of a service (waiting times, facilities available, etc.).

b)  “Intervision” or peer review

  1. Based on reciprocity, the peer-review method (or “intervision”) consists in having judges assess each other outside the hierarchical framework. Imported from the Netherlands, the technique relies on a pair of judges observing each other in the actual course of their work in order to improve professional practice. This is very much part of a comprehensive approach to quality assessment and improvement.

  1. It might, however, be a useful complement to the “judges” questionnaire, following the example of the Netherlands, where it forms part of the RechtspraaQ quality system. Some French courts have begun to develop this practice over the last ten years.[3]

c)  “Mirror surveys”

  1. “Mirror surveys” consist in getting court staff to assess the level of user satisfaction or encouraging them to look at their own work (examples: 2008 survey by the French High Council of the Judiciary; Romanian survey on the independence of the judicial system).

  1. This method can be used to compare the satisfaction rate expressed by users with the rate as perceived by court staff. It also has the advantage of involving the latter more closely in the assessment process.

d)  “Mystery shopping”

  1. Mystery shopping is a technique increasingly being used in areas of activity concerned with customer satisfaction and quality development. The “mystery shopper” is a person sent by a specialist outside firm who poses as a customer in order to measure the standard of service and reception. This person is given specific assessment criteria, which will be sent to the survey sponsor, often in the form of a questionnaire. Although the practice is still uncommon in courts, some countries, such as Ireland, have used it to measure the quality of relations between court staff and users, as well as surveying the working environment. Sponsored by the Courts Service of Ireland, these ”mystery shops” carried out within court premises, by telephone and by email have produced positive results regarding reception by court staff and the latter’s availability.

3.4       Selecting a sample

  1. The question that arises at this stage is which people to question. Two principal situations need to be considered:

§  where the number of users is limited and known: if the number is limited, it is perfectly possible to question them all. This solution offers the advantage of obviating the need to constitute a sample and, in particular, of not giving the impression that the opinion of some people might count more than that of others. This applies for example when the users questioned are lawyers and certain court officers;

§  where the number of users is large and/or they are not all known: it is then not possible to question all users and it is necessary to select a sample.

  1. Two main solutions must be taken into account: selection of a random sample and putting together a quota-based sample.

  1. When a list of everyone belonging to the universe of users covered by the survey is available either directly or indirectly, those ultimately to be questioned can be randomly selected (for example, every Xth person who enters the building on randomly selected days), or else a number of individuals can be chosen at random from the list available.
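
The “every Xth person” approach mentioned above is known as systematic random sampling. A minimal sketch in Python follows; the list of users and the sample size are purely illustrative:

```python
import random

def systematic_sample(user_list, sample_size):
    """Systematic random sampling: take every k-th person from a known
    list of users, starting at a random offset."""
    k = max(1, len(user_list) // sample_size)
    start = random.randrange(k)
    return user_list[start::k][:sample_size]

# Hypothetical list of 500 registered users of a court
users = [f"user_{i}" for i in range(500)]
sample = systematic_sample(users, 50)
print(len(sample))  # 50
```

The random starting offset avoids the bias of always beginning with the first person on the list.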

  1. In most cases, such a list is not available, so it is recommended that groups of users be chosen first on the basis of criteria such as the type of proceedings, type of player, age, gender, etc. In this case the term “quota-based sample” is employed. The table below shows the structure of such a sample, taking the type of proceedings, gender and age as the criteria applied:

                     Civil proceedings            Criminal proceedings
                     Respondent    Claimant       Defendant    Complainant

Man      Under 30
         31-50
         51 and over

Woman    Under 30
         31-50
         51 and over

  1. Once the structure of the sample has been defined, it will be necessary to determine the minimum number of people to be questioned per sub-group (recommendation: a minimum of 30 persons per sub-group). The interviewers will then recruit these individuals in such a way as to fill the entire table (example: finding a woman aged 51 or over who is a respondent in civil proceedings, and so on).

If the quota technique is employed, it will clearly be necessary to take into account the fact that the structure of the sample is not proportionate to the actual situation and that the results may have to be weighted. For instance, if 30 people are taken in each sub-group, this number does not correspond proportionately to the actual situation (there may be many more men aged at least 30 who are defendants in criminal proceedings than women aged over 51, in which case the sub-groups will have to be weighted according to their actual size).
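
The weighting of fixed-size quota sub-groups can be sketched as follows. All sub-group names, mean satisfaction scores and population sizes are hypothetical, and serve only to show how results are re-weighted by actual sub-group size:

```python
# Weighting the results of fixed-size quota sub-groups by the sub-groups'
# actual share of the court's user population. All figures are hypothetical.
subgroups = {
    # sub-group: (mean satisfaction on a 0-10 scale, actual population size)
    "men under 30":   (6.2, 400),
    "men 31-50":      (7.1, 900),
    "women under 30": (6.8, 300),
    "women 31-50":    (7.4, 600),
}

total_population = sum(size for _, size in subgroups.values())
weighted_mean = sum(mean * size for mean, size in subgroups.values()) / total_population

print(round(weighted_mean, 2))  # 6.98
```

An unweighted average of the four sub-group means would be 6.88, which illustrates how equal-size quotas distort the overall figure when sub-groups differ in real size.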

  1. As far as setting up a sample is concerned, it is very important to note that the size of the sample will mainly depend on:

§  the margin of error accepted: this is the positive or negative deviation that you allow on the results of a survey;

§  the confidence level desired: this is the degree of certainty of the margin of error (often a 95% confidence level is applied). If a higher confidence level is desired, the size of the sample will have to be increased;

§  the response rate: account must be taken of an estimated response rate; if the response rate is low, it is recommended to increase the size of the sample so as to have sufficient replies that can be analysed;

§  the number of sub-groups to be put together: as the size of a sub-group should not be lower than 30, the size of the sample will also depend on the number of sub-groups to be studied.
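
The interplay of margin of error, confidence level and response rate can be illustrated with the standard sample-size formula for proportions, assuming a large user population and the most conservative expected proportion (50%); all figures below are illustrative:

```python
import math

def required_sample_size(margin_of_error, z=1.96, p=0.5, response_rate=1.0):
    """Number of people to contact, using the standard formula for
    proportions (large population assumed).

    z             -- z-score of the desired confidence level (1.96 ~ 95%)
    p             -- expected proportion (0.5 is the most conservative choice)
    response_rate -- estimated share of people contacted who will reply
    """
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n / response_rate)

# 95% confidence, +/-5% margin of error, 40% expected response rate
print(required_sample_size(0.05, response_rate=0.4))  # 961
```

With a 100% response rate the same parameters give the familiar figure of 385 respondents; a low expected response rate thus more than doubles the number of people to contact.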

3.5     The questionnaire and the organisation of the survey

3.5.1    Management and administration

  1. Court staff should already be involved at the preparatory stage by establishing a steering committee (see above).

  1. Use of outside bodies for the administration, or even the design, of the questionnaire and the processing of the results will make the procedure more professional. Such bodies include polling companies (as in France, Romania, the United Kingdom and Canada), external consultants (as in Austria, Ireland, Spain, Switzerland and Canada) or, where they exist, independent bodies responsible for producing performance measurement tools such as satisfaction surveys (as in the Netherlands (the Prisma agency) and the United States (National Center for State Courts)). This will, however, depend on the resources available to the court. A partnership with university and/or research teams seems the best solution (as in Albania, Finland and Spain).

3.5.2    Survey timing

  1. User availability is a key factor. This will determine whether it is better to send the questionnaire with the summons, make it available at the reception desk as users leave the hearing room or the court building, post it on the court’s website or send it by ordinary mail or email.

  1. In any case, it is essential that the court concerned should inform users beforehand to enable them to feel part of the survey procedure.

3.5.3    The questionnaire

  1. The questionnaire must be accompanied by a preliminary notice indicating the survey sponsor and aims. This notice must point out that anonymity safeguards will be respected and supply information on the ethical principles applying to use of the data provided.

  1. The content of the questionnaire will be largely determined by the service area or aspects of a service that you wish to evaluate (reception, speed, efficiency, accessibility, etc.). It must bring to light user perceptions of the court concerned and thus enable its strengths and weaknesses to be identified in order to review service targets and fine-tune methods of service delivery.

  1. The questionnaire should start with an introductory section containing simple questions to gain the user’s trust.

  1. Next, the main themes of the questionnaire should be arranged under headings, starting with the general perception of the service and going on to more specific aspects, such as access to information, court facilities or court operation (reception, contact with judges and public prosecutors, etc.). The various themes selected should consist of series of items alternating simple questions with more sensitive questions.

  1. The form of the questionnaire must be such that it can be adapted for all courts in Council of Europe member States. It should usually consist of easy-to-process closed-ended questions or statements, which can be accompanied, if appropriate, by open-ended questions for users to convey their opinions on matters that they think important and which might not have been addressed by the survey. However, the number of open-ended questions should be limited in order not to complicate the processing.

  1. The questionnaire must include a fixed section concerning key indicators common to all courts in Europe and easily tailored, as required, to procedural needs. It may also include variable sections to reflect specific features of different local and judicial cultures and to explore what court managers consider to be crucial problems.

  1. Finally, the language used must be clear (short sentences, no ambiguity), neutral (no negative sentences or emotive words) and easily understood by all court users in Council of Europe member States. Translations of the standard questionnaire must therefore be careful to include the most appropriate terms in each national language.

3.5.4    Questionnaire medium

  1. The questionnaire may be administered in paper form or electronically using public access terminals. It can also be produced in an electronic format that is easy to process with a spreadsheet.

3.5.5    Response scales

  1. Various response scales are possible. Some scales ask the user to choose an item (questions along the lines of “Select from the following replies …”, as used in the satisfaction survey conducted by the Supreme Court of Canada).

  1. Some rely on ranking of set answers (e.g., “Rank the following replies from 1 to …”). Other scales can be used to obtain simple replies through a binary rating (“Satisfied/Dissatisfied”; “Yes/No”) or more detailed user preferences through a broader rating (a 0-to-10 scale on the pattern of the user questionnaires of the Spanish General Council of the Judiciary (2001) or a satisfaction scale ranging from “Very satisfied/Strongly agree” to “Very dissatisfied/Strongly disagree” as in the response scales for the British and US surveys).

  1. Of special interest are surveys such as those conducted in the courts of the canton of Geneva (Switzerland) that make it possible to measure the gap between user expectations and user satisfaction for each item using a dual assessment (importance and satisfaction).
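The dual-assessment idea can be sketched in a few lines: for each item, the gap between mean importance and mean satisfaction shows where the court most falls short of user expectations. The item names and the 1-to-5 ratings below are invented for illustration, not taken from the Geneva questionnaires.

```python
# Sketch of a dual-assessment (importance vs. satisfaction) gap analysis.
# Item names and 1-to-5 ratings are illustrative assumptions.

def mean(values):
    return sum(values) / len(values)

def gap_analysis(importance, satisfaction):
    """Return items sorted by gap (mean importance minus mean satisfaction),
    largest gap first; a large positive gap flags a priority for action."""
    gaps = {}
    for item in importance:
        gaps[item] = round(mean(importance[item]) - mean(satisfaction[item]), 2)
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

importance = {
    "Punctuality of hearings": [5, 4, 5, 4],
    "Courtroom furnishing":    [2, 3, 2, 3],
}
satisfaction = {
    "Punctuality of hearings": [2, 3, 2, 3],
    "Courtroom furnishing":    [3, 3, 4, 3],
}

for item, gap in gap_analysis(importance, satisfaction):
    print(item, gap)
```

An item rated highly important but poorly satisfying (here, punctuality) surfaces at the top of the list; a negative gap means satisfaction already exceeds the importance users attach to the item.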

3.6     Recording and analysing results

  1. In most surveys, an outside body is enlisted to record and analyse the data, firstly to guarantee the anonymity of replies and secondly to analyse the results and present the survey together with, where appropriate, any recommendations.

  1. However, it is essential that the steering committee should also identify contact persons within the court (for example, court staff made available for this purpose) to provide methodological assistance for users where necessary. The close involvement of court staff in this process is vital.

  1. Depending on the survey timetable, it is necessary to agree on the frequency with which responses are to be collected, whether the questionnaires are retrieved from a box provided at the reception desk or received by post or email. This makes it possible to compare satisfaction rates according to the period during which responses were collected (day, week, month, etc.).
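Comparing satisfaction rates across collection periods amounts to grouping responses by period and averaging them, which can be sketched as follows (the period labels and 1-to-5 ratings are invented for illustration):

```python
# Sketch: comparing mean satisfaction across response-collection periods.
# Period labels and 1-to-5 ratings are illustrative assumptions.
from collections import defaultdict

def satisfaction_by_period(responses):
    """responses: list of (period, rating) pairs -> {period: mean rating}."""
    buckets = defaultdict(list)
    for period, rating in responses:
        buckets[period].append(rating)
    return {p: round(sum(r) / len(r), 2) for p, r in buckets.items()}

responses = [
    ("week 1", 4), ("week 1", 3), ("week 1", 5),
    ("week 2", 2), ("week 2", 3),
]
print(satisfaction_by_period(responses))
```

The same grouping works for days or months; a marked drop in one period may point to a temporary problem (staff shortage, building works, etc.) rather than a structural one.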

  1. It must be possible to enter the replies to the questionnaire in appropriate software, whether spreadsheets for simple surveys (such as Microsoft Office Excel) or statistical software for surveys requiring more complex analyses (many commercial programs [SPSS, Statistica, etc.] and free programs [SAS University Edition, Tanagra, PSPP, etc.] are available on the internet[4]). This applies in particular when the intention is to show the gap between user expectations and satisfaction. When following the traditional approach of first questioning people about their expectations and then asking them for their actual assessment of an aspect of the service (such as the information provided), appropriate software is recommended in order, for example, to measure the effect of one or more aspects of satisfaction on overall satisfaction. A big difference on a specific aspect may have either a very considerable impact on overall satisfaction or virtually none at all: for example, the lack of parking spaces in the immediate vicinity of the court building may be strongly criticised but ultimately have no effect on the overall assessment of the court services provided.

3.7     Reporting results and determining measures to be taken

  1. Organisation and communication of feedback are an integral part of the survey process. This should be addressed within the framework of a court plan and quality-based approach. This entails setting up a follow-up committee to disseminate the survey results (in the form of a report presenting both the survey and the results obtained) and draw conclusions, especially in terms of identifying priorities for action. Failure to take any measures following a survey which has identified problems may lead to frustration on the part of the individuals concerned and ultimately result in their reluctance to participate in new surveys.

  1. Communication must take place both in-house (oral presentation, discussion meetings), to involve staff in seeking practical solutions, and with regard to users (thank-you letters, information campaigns, results displayed in the court’s reception area, etc.), who are thus informed about, and even involved in, any undertakings to make improvements.

  1. Ad hoc surveys should not be considered sufficient for this purpose, and the process must be repeated regularly to measure changes in satisfaction levels. Media coverage of the process and the results helps to strengthen and promote the court plan and to obtain outside backing for it.

  1. Another point to remember is the need to document the survey fully and precisely. It is unfortunately not uncommon for a survey and the corresponding data to be inaccessible, or even lost, several years later. Courts’ archiving services can help ensure that all surveys conducted are properly documented and remain accessible for future work.


INFORMATION for Survey managers

The shaded parts are optional.

The basic questionnaire, made up of 20 closed-ended questions and one open-ended question, constitutes a standard format common to all courts in the Council of Europe member States. More specific or locally oriented questions can be added in the second section, for which a number of models are suggested. It is important to note that a usable survey must comprise a limited number of questions that users can answer quickly.

Arrangements for distributing and returning the questionnaire. Several wordings can be used:

1) If distributed within the court

Please answer this questionnaire, then place it in the box provided at the court’s reception desk, using the sealed envelope provided.

2) If sent with the court summons

Please answer this questionnaire and return it to the address on the postage-paid envelope.

Note: if the questionnaire is made available by electronic means:

You may submit your reply online to the website address appearing on the document. This site is secure and your anonymity is guaranteed.


MODEL QUESTIONNAIRE FOR COURT USERS

Dear Sir/Madam

This questionnaire is part of an assessment of the quality of the justice system, focusing more specifically on the quality of services and operation of the [type of court] in ………………………..

Your opinion and suggestions are important to us and we would be grateful if you would take a little time to reply to the questions below. The questionnaire is anonymous and we guarantee that your replies will be dealt with in the strictest confidence.

Please tick the appropriate boxes

1.   In what capacity are you [were you] at the court in ……………………………..?

r1 As a party to proceedings

r2 As a witness

r3 As a member of the jury

r4 Other (e.g. family of one of the parties, requesting information, visitor, etc.)

2.   On what type of procedure was the case for which you went to the court based?

r1 Civil procedure

r2 Administrative procedure

r3 Commercial procedure

r4 Labour law

r5 Criminal procedure

r6 Other (minors, guardianship, pensions, register, etc.). Please specify: ………………………………………………………..

r7 Don’t know

[If the questionnaire is specifically intended for users of court registry services]

2a. Which court registry services have you used in the course of the past year?

r1 Information on legal aid services

r2 Information on forms of legal action

r3 Access to documents (e.g. copy of evidence)

r4 Information on the court’s decisions

r5 Information on the execution of decisions

r6 Other; please specify: ………………………………………………………..

2b. What means of communication have you used to contact the court registry?

r1 in person

r2 post

r3 telephone

r4 fax

r5 email

r6 online via the court’s website

3.   Were you assisted by a lawyer?

r1 yes

r2 no

[optional question]

4.   What level of confidence do you have in the justice system?

Very low confidence

Low confidence

Average confidence

High confidence

Very high confidence

r1

r2

r3

r4

r5

5. If you were a party, and the decision was delivered, did the court find partially or fully in your favour?

r1 yes, fully

r2 yes, partly

r3 no

r4 I was not a party

6. Were the hearings held in your mother tongue?

r1 yes (go to 8)

r2 no

7. If the hearing was not held in your mother tongue, were you given an interpreter?

r1 yes

r2 no

7a. Was the conduct of the oral proceedings in …………………. (language) a disadvantage for you?

r1 yes

r2 no


8. Assess the importance you attach to the following elements:

Not important

Not very important

Average importance

Important

Very important

No reply

8.1 Conditions of access to the court

r1

r2

r3

r4

r5

r6

8.2 Signposting in the court building

r1

r2

r3

r4

r5

r6

8.3 Waiting conditions

r1

r2

r3

r4

r5

r6

8.4 Courtroom furnishing

r1

r2

r3

r4

r5

r6

8.5 Clarity of summonses

r1

r2

r3

r4

r5

r6

8.6 The time lapse between the summons and the hearing

r1

r2

r3

r4

r5

r6

8.7 Punctuality of hearings

r1

r2

r3

r4

r5

r6

8.8 Attitude and courtesy of court staff

r1

r2

r3

r4

r5

r6

8.9 Level of competence of non-judicial court staff

r1

r2

r3

r4

r5

r6

8.10 Attitude and courtesy of judges and prosecutors

r1

r2

r3

r4

r5

r6

8.11 The language used by the judges and prosecutors

r1

r2

r3

r4

r5

r6

8.12 Time allowed to set out your arguments at the hearing

r1

r2

r3

r4

r5

r6

8.13 Timeframe for the delivery of judgments

r1

r2

r3

r4

r5

r6

8.14 Clarity of judgments

r1

r2

r3

r4

r5

r6

[Optional elements:]

8.15 Information provided by the court’s information service

r1

r2

r3

r4

r5

r6


9. Assess your degree of satisfaction with regard to the following elements

Not satisfied

Not very satisfied

Average satisfaction

Satisfied

Very satisfied

No reply

9.1 Conditions of access to the court

r1

r2

r3

r4

r5

r6

9.2 Signposting in the court building

r1

r2

r3

r4

r5

r6

9.3 Waiting conditions

r1

r2

r3

r4

r5

r6

9.4 Courtroom furnishing

r1

r2

r3

r4

r5

r6

9.5 Clarity of summonses

r1

r2

r3

r4

r5

r6

9.6 The time lapse between the summons and the hearing

r1

r2

r3

r4

r5

r6

9.7 Punctuality of hearings

r1

r2

r3

r4

r5

r6

9.8 Attitude and courtesy of court staff

r1

r2

r3

r4

r5

r6

9.9 Level of competence of non-judicial court staff

r1

r2

r3

r4

r5

r6

9.10 Attitude and courtesy of judges and prosecutors

r1

r2

r3

r4

r5

r6

9.11 The language used by the judges and prosecutors

r1

r2

r3

r4

r5

r6

9.12 The time allowed to set out your arguments at the hearing

r1

r2

r3

r4

r5

r6

9.13 Timeframe for the delivery of judgments

r1

r2

r3

r4

r5

r6

9.14 Clarity of judgments

r1

r2

r3

r4

r5

r6

[Optional elements:]

9.15 Information provided by the court’s information service

r1

r2

r3

r4

r5

r6


[Optional questions:]

10. In general terms, what is your assessment of the operation of the courts?

Too opaque

Opaque

Clear

Very clear

r1

r2

r3

r4

11. What is your assessment of the judges’ impartiality in conducting oral proceedings?

Not at all impartial

Not very impartial

Fairly impartial

Completely impartial

r1

r2

r3

r4

12. What is your assessment of the speed at which your case was dealt with by the court?

Too slow

Slow

Normal

Fast

Very fast

r1

r2

r3

r4

r5

13. Without taking into account lawyers’ fees, what is your assessment of the costs of access to justice?

Very low costs

Low costs

Average costs

High costs

Very high costs

r1

r2

r3

r4

r5

14. Based on your experience, what is your assessment of the resources available to the courts?

Very insufficient

Insufficient

Sufficient

Broadly sufficient

r1

r2

r3

r4

15. In general, how do you assess the possibility of finding out about one’s rights?

Very difficult

Quite difficult

Fairly easy

Very easy

r1

r2

r3

r4


Personal data

16. Did you use legal protection insurance? 

r1 yes

r2 no

17. Did you receive legal aid?

r1 yes

r2 no

18. Have you already been in contact with another court than the court in…………………………………..?

r1 yes, (specify which) …………………………………………………………………………………………………….

r2 no

19. Gender

r1 Male

r2 Female

20. Age

r1 18-30

r2 31-50

r3 51-65

r4 66 and over

21. Do you have any remarks or suggestions to make in connection with the operation of the court in ……………………………………………… and the justice system more generally?

MODEL QUESTIONNAIRE FOR LAWYERS

Note to local survey managers. The questionnaire intended for lawyers should if possible be emailed to all members of the Bar association.

ASSESSMENT OF THE OPERATION OF THE COURTHOUSE IN …………………………………………..

BY LAWYERS OF THE BAR ASSOCIATION OF …………………………………………………………………………………………………….

Your opinions and suggestions are important to us and we would be grateful if you would take a little time to reply to the questions below. The questionnaire is anonymous and we guarantee that your replies will be dealt with in the strictest confidence.

Please tick the appropriate boxes

Assess the importance you attach to the following elements

1.   General services

Not important

Not very important

Average importance

Important

Very important

No reply

1.1 Co-ordination in setting hearing times

r1

r2

r3

r4

r5

r6

1.2 Access to the case-law of the courts of the judicial area

r1

r2

r3

r4

r5

r6

1.3 Communication between the court and lawyers

r1

r2

r3

r4

r5

r6

1.4 Clarity in terms of organisation and administrative responsibilities

r1

r2

r3

r4

r5

r6

1.5 Quality of the court’s website

r1

r2

r3

r4

r5

r6

1.6 Signposting in the court building

r1

r2

r3

r4

r5

r6


For the next questions, please only choose the court or service with which you have had the most contact (legal aid office, Family Division, juvenile court, criminal hearings department, etc.).

2.   Relations with the court or service

Not important

Not very important

Average importance

Important

Very important

No reply

2.1 Attitude and courtesy of judges and prosecutors

r1

r2

r3

r4

r5

r6

2.2 Attitude and courtesy of court officers

r1

r2

r3

r4

r5

r6

2.3 Judges’/prosecutors’ professional competence

r1

r2

r3

r4

r5

r6

2.4 Court officers’ professional competence

r1

r2

r3

r4

r5

r6

2.5 Judges’/prosecutors’ approachability and availability

r1

r2

r3

r4

r5

r6

2.6 Court officers’ approachability and availability

r1

r2

r3

r4

r5

r6

2.7 Speed of replies to requests

r1

r2

r3

r4

r5

r6

2.8 Quality and reliability of registry’s responses

r1

r2

r3

r4

r5

r6

2.9 Computerised management of proceedings

r1

r2

r3

r4

r5

r6

2.10 Ease of file consultation

r1

r2

r3

r4

r5

r6

2.11 Clarity of responsibilities and organisation

r1

r2

r3

r4

r5

r6

2.12 Costs of/fees for access to justice

r1

r2

r3

r4

r5

r6


3.   Preparation and conduct of hearings

Not important

Not very important

Average importance

Important

Very important

No reply

3.1 Conditions of meetings with clients

r1

r2

r3

r4

r5

r6

3.2 Furnishing and equipment of the courtroom

r1

r2

r3

r4

r5

r6

3.3 Punctuality of hearings

r1

r2

r3

r4

r5

r6

3.4 Organisation and conduct of hearings

r1

r2

r3

r4

r5

r6

4.   Judges’ decisions

Not important

Not very important

Average importance

Important

Very important

No reply

4.1 Clear and comprehensible judgments

r1

r2

r3

r4

r5

r6

4.2 Rapid handling of cases

r1

r2

r3

r4

r5

r6

4.3 Decisions easy to enforce

r1

r2

r3

r4

r5

r6


Assess your degree of satisfaction with regard to the following elements

5.   General services

Not satisfied

Not very satisfied

Average satisfaction

Satisfied

Very satisfied

No reply

5.1 Co-ordination in setting the times of hearings

r1

r2

r3

r4

r5

r6

5.2 Access to the case-law of the courts of the judicial area

r1

r2

r3

r4

r5

r6

5.3 Communication between the court and lawyers

r1

r2

r3

r4

r5

r6

5.4 Clarity in terms of organisation and administrative responsibilities

r1

r2

r3

r4

r5

r6

5.5 Quality of the court’s website

r1

r2

r3

r4

r5

r6

5.6 Signposting in the court building

r1

r2

r3

r4

r5

r6

6.   Relations with the court or service

Not satisfied

Not very satisfied

Average satisfaction

Satisfied

Very satisfied

No reply

6.1 Attitude and courtesy of judges and prosecutors

r1

r2

r3

r4

r5

r6

6.2 Attitude and courtesy of court officers

r1

r2

r3

r4

r5

r6

6.3 Judges’/prosecutors’ professional competence

r1

r2

r3

r4

r5

r6

6.4 Court officers’ professional competence

r1

r2

r3

r4

r5

r6

6.5 Judges’/prosecutors’ approachability and availability

r1

r2

r3

r4

r5

r6

6.6 Court officers’ approachability and availability

r1

r2

r3

r4

r5

r6

6.7 Speed of replies to requests

r1

r2

r3

r4

r5

r6

6.8 Quality and reliability of registry’s responses

r1

r2

r3

r4

r5

r6

6.9 Computerised management of proceedings

r1

r2

r3

r4

r5

r6

6.10 Ease of file consultation

r1

r2

r3

r4

r5

r6

6.11 Clarity of responsibilities and organisation

r1

r2

r3

r4

r5

r6

6.12 Costs of/fees for access to justice

r1

r2

r3

r4

r5

r6

7.   Preparation and conduct of hearings

Not satisfied

Not very satisfied

Average satisfaction

Satisfied

Very satisfied

No reply

7.1 Conditions of meetings with clients

r1

r2

r3

r4

r5

r6

7.2 Courtroom furnishing and equipment

r1

r2

r3

r4

r5

r6

7.3 Punctuality of hearings

r1

r2

r3

r4

r5

r6

7.4 Organisation and conduct of hearings

r1

r2

r3

r4

r5

r6

8.   Judges’ decisions

Not satisfied

Not very satisfied

Average satisfaction

Satisfied

Very satisfied

No reply

8.1 Clear and comprehensible judgments

r1

r2

r3

r4

r5

r6

8.2 Rapid handling of cases

r1

r2

r3

r4

r5

r6

8.3 Decisions easy to enforce

r1

r2

r3

r4

r5

r6


 [Optional questions:]

9. In general terms, what is your assessment of the operation of the court (service)?

Too opaque

Opaque

Clear

Very clear

r1

r2

r3

r4

10. What is your assessment of the judges’ impartiality in conducting oral proceedings?

Not at all impartial

Not very impartial

Fairly impartial

Completely impartial

r1

r2

r3

r4

11. What is your assessment of the judges’ independence?

Not at all independent

Not very independent

Fairly independent

Completely independent

r1

r2

r3

r4

12. In your opinion, how has the operation of the court (service) changed over the last five years?

Has considerably deteriorated

Has deteriorated

Has not changed

Has improved

Has considerably improved

r1

r2

r3

r4

r5

13. What is your assessment of any changes in the court’s workload over this five-year period?

r1 The workload has increased faster than the resources available

r2 The workload has increased in proportion to the resources available

r3 The resources available have increased faster than the workload

14. In your opinion, are the material resources available to the court sufficient?

Very insufficient

Insufficient

Sufficient

Broadly sufficient

r1

r2

r3

r4

15. In your opinion, are the human resources available to the court sufficient?

Very insufficient

Insufficient

Sufficient

Broadly sufficient

r1

r2

r3

r4


Personal data

16. How many years have you been a member of the Bar in …………….?

r1 Less than 5 years

r2 5-10 years

r3 11-20 years

r4 More than 20 years

17. How do you practise your profession as a lawyer?

r1 Alone

r2 As a member of a group (company)

18. Gender

r1 Male

r2 Female

19. Age

r1 Under 30

r2 31-50

r3 51-65

r4 66 and over

20. Do you have any remarks or suggestions to make in connection with the operation of the court and, more generally, the justice system?



[1] Rafal Pelc, “What are the expectations and the needs of justice users: the experience of the Polish Ombudsman”, CEPEJ study session, 2003

[2] Marie-Luce Cavrois, Hubert Dalle, Jean-Paul Jean (eds.), La qualité de la justice, coll. Perspectives sur la Justice, Paris: La documentation Française, 2002, 269 p.

[3] The courts of Roanne, Créteil and Albertville and the Appeal Court in Caen. Based on this experience, in 2008 the Ecole nationale de la magistrature drew up an "Intervision" charter and an observation form so as to clarify the context and method.

[4] See for instance the following link: LimeSurvey https://demo.limesurvey.org/index.php?r=admin/authentication/sa/login