Reference Indicators for Public Policies in the EU
Policy options for indicators on public administration

In their Resolution on Quality and Benchmarking of Public Services in the EU, adopted at their 8th meeting in Strasbourg in November 2000, the European Ministers of the Public Service and Administration singled out the identification of indicators linked to the most used public services as an important step towards defining a common European area for quality in public services. At their 35th meeting, also in Strasbourg in November 2000, the Directors General decided on a work programme for the Innovative Public Services Group (IPSG). The second point in that programme concerned the promotion and dissemination of benchmarking and the identification of quality indicators on a European scale. The work programme specifically mentioned: “Develop new studies enabling the identification of reference indicators in relation to services most used by citizens in their everyday lives.”

Italy launched an initiative in this area at the Strasbourg meeting and has since worked on the project. A progress report – Developing European Benchmarking Experiences: Italy’s proposal, note on work progress (24 April 2001) – and a methodological note – Reference Indicators for Public Policies in the EU, a methodological note – are attached to this memo. (Italy, fill in.)

Previous activities in the IPSG

At the very beginning of the work in what is now called the IPSG, Austria contributed a report on performance indicators, showing that there is a wealth of indicators in all member states.

The UK has chaired a Learning Lab under the IPSG on service indicators – the ASKME project. The aim of the project was to study the possibility of comparing service delivery from the point of view of citizens’ accessibility. A report – Accessible Service and Knowledge Management, a European benchmarking project – has been published (at ), showing that it is possible to acquire relevant information on accessibility at a fairly low cost, as long as some margin of error is accepted. Accessibility here means waiting times, the number of places for, and forms of, information, applications and delivery of services, etc. This information is deemed to be of interest at the policy level, but it is much too crude for citizens’ choice.

Uppsala: explore the role of indicators!

At the 36th meeting of the Directors General on 17–18 May 2001 in Uppsala, it was agreed in relation to the IPSG that “the group should serve as a focus for continued work on benchmarking activities including the Italian proposal to explore the role of service indicators.”

This memo addresses the question of what role indicators can have in the informal co-operation between the EU member states on public administration. It points out various policy options, all related to the collection and dissemination of indicators, and it gives examples of what is already being done in the field of indicators.

For the future work of the IPSG on indicators it is necessary to focus sharply on what is beneficial for the development of public administrations across Europe. Focusing is necessary because the options are many and the possible indicators as numerous as the stars in the sky. The possible benefits of having access to information through indicators must be weighed against the costs of collecting them, since collection is expensive and requires sustained effort over a long period.

The paper is written as a menu of different options, to show the wide variety of roles that indicators may have. The fact that an option is mentioned does not mean that the author of the paper favours it. It is for the Directors General, and ultimately for the Ministers, to choose among the options. There is, however, a philosophy common to all options: the collection of indicators should be preceded by, and based on, some form of program whose purpose is linked to, and decisive for, the design of the indicators.

Statistics

There is a large body of international statistics in various fields, ranging from the number of combat units in armed forces to levels of pollution and the number of people below the poverty line. Statistics production has been going on for many, many years and some of it is based on long traditions. Many statistics find their way into reports and are used in decision-making. Much is used for research. And some is not used at all.

At the same time there is a growing output of indicators. Are indicators something different from statistics? They are numbers, like statistics. Statistics stem from curiosity and concern. Indicators do too, but in a more immediate and direct way. Indicators are – or ought to be – designed to answer very specific questions: “Are emissions below the accepted level?”, “Is the X-agency more efficient this year than last?”, “Is the program being implemented on time?”

Statistics and indicators do not differ in kind but in origin. The growing output of indicators signals a growing demand for numbers for immediate decision purposes.

It is advisable to keep the general production of statistics apart from the use of indicators. The production of indicators must not serve as a substitute for the production of statistics. The production of international statistics relating to public administration is, and shall continue to be, handled through other channels – the UN, the OECD, Eurostat – rather than through the informal co-operation between EU member states in the field of public administration. Ministers and DGs of public administration may, however, express their wishes concerning the production of statistics. The production of indicators shall be reserved for instances of “immediate use” in some way or another. Some of these “immediate uses” are exemplified in the following sections.

Policy option 1: Develop a program for statistics on public administration to be channelled to the relevant international statistical office(s).

Observation

The first and most obvious “immediate use” of indicators is for observing an area, a part of society or a policy field. Several governments and international organisations have had indicators worked out for areas such as poverty, the environment and health. Indicators are used to give a picture of the state of the area, so that decision-makers notice changes that may call for action.

One example of this usage is the series of reports by the Danish Ministry of Finance called “Structural Monitoring”. The aim is to give the Government – and the general public – a picture of Denmark’s welfare position relative to other countries and of the most important determinants of that welfare.

Chart 2.1. The structural monitoring system.

Note: The chart – like the actual structural monitoring system – is not exhaustive, in the sense that not all causes of prosperity and welfare are included. For reasons of clarity, also, only the most important connecting lines have been drawn in.

Source: Structural Monitoring – International Benchmarking of Denmark, the Danish Government, 2000.

One area in the structural monitoring system that comes close to the general public administration area is what is called “public-sector accessibility”. Under this heading, the following aspects of accessibility are monitored:

  • The number of public authorities that the citizen or company has to contact to obtain a given service.
  • The length of time the citizen or company has to wait for delivery of that service.
  • Citizens' freedom of choice and hence citizens' and companies' scope for choosing those authorities or institutions whose accessibility – and other services – is most satisfying.
  • The use of information technology that enables citizens and companies to access information and, where relevant, communicate digitally with public authorities as well, instead of having to make enquiries by phone, post or in person.

These four accessibility parameters have been applied to the following 12 specific public services:

  • Issuing of passports
  • Acquisition of driving licences
  • Registration of real property
  • Civil actions at courts
  • Criminal proceedings at courts
  • Allocation of children's allowance
  • Approval of adoption
  • Handling of industrial injury cases
  • Settlement of personal income tax
  • Registration of companies subject to VAT
  • Hospital treatment
  • Approval of medical drugs

An example of the kind of information that is provided concerns children’s allowances.

Chart 20.9. Average case-handling time for initial allocation of children's allowance.

Note: The German authorities assess the average case-handling time for initial allocations as varying between 7 and 14 days. This assessment has been set as 11 days in the chart. In England the authorities have reported that their primary target is to complete 68% of first-time applications within 10 days and, secondarily, 94% of first-time applications within 30 days. In the chart this has been set as an average case-handling time of 17 days. In Japan the authorities assess an average case-handling time of 34 days. In the USA case-handling times are not registered, but under federal law rulings have to be made within 45 days. Based on this, the average case-handling time for the USA has been set as 34 days, as for Japan.
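The note above illustrates how differently expressed national targets (averages, percentile targets, statutory deadlines) have to be converted into a single comparable figure. The report does not document the conversion rule it used; the sketch below is a purely hypothetical illustration of one simple convention – assume every case in a band takes the band's full deadline, plus an assumed cap for the remaining tail – applied to England's reported targets. The function name and the 40-day tail cap are illustrative assumptions, not figures from the report.

```python
# Hypothetical sketch only: how percentile targets such as England's
# "68% within 10 days, 94% within 30 days" could be turned into a single
# comparable average. This is NOT the method used in the Danish report,
# which does not state its conversion rule.

def implied_average(targets, tail_cap):
    """targets: cumulative (share_completed, deadline_days) pairs in
    increasing order; tail_cap: assumed handling time, in days, for the
    remaining cases not covered by any target."""
    avg, prev_share = 0.0, 0.0
    for share, deadline in targets:
        avg += (share - prev_share) * deadline   # band takes its full deadline
        prev_share = share
    avg += (1.0 - prev_share) * tail_cap         # assumed tail
    return avg

# England's reported targets from the note above; 40 days is an assumed cap.
print(implied_average([(0.68, 10), (0.94, 30)], tail_cap=40))  # 17.0 days
```

Under these particular assumptions the convention happens to reproduce a 17-day figure, but the point is only that some such convention is needed before cross-country averages can be charted at all.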

Source: Structural Monitoring – International Benchmarking of Denmark, the Danish Government, 2000.

Similar monitoring schemes have occasionally been worked out for the Netherlands, Finland and Sweden. Finland had a more focused approach, limiting the observation area to competitiveness.

Denmark has produced two of these reports; the other countries mentioned have produced only one. It is not certain that Denmark will continue producing the report, as it is a quite costly exercise. However, the report has had some impact on decision-making in Denmark, which may call for a continuation. For one thing, it prompted an in-depth international benchmarking of education. (Denmark, fill in.) In Sweden, the Danish report, which includes Sweden, as well as the Swedish report, went unnoticed. In Finland and the Netherlands (Finland and the Netherlands, fill in.) these reports ……

The usage of reports such as these depends very much on the context in which they are commissioned. In Denmark it was a political decision to produce the report; hence there was political interest in the results and a readiness to act on them. In Sweden the report was commissioned merely out of the curiosity of a think tank. In Finland and the Netherlands (Finland and the Netherlands, fill in)……

Policy option 2: Develop a program for observing the state and the development of public administrations in the EU 15 by means of a set of indicators.

Such a program can take many different shapes and have many different perspectives. Here are some examples of sub-options:

2.1 Citizens’ accessibility to various public services (waiting and serving times, service points, distribution channels, information etc.)

2.2 Coverage of public services provided in relation to citizens’ needs (enrolment, public versus private pensions, unemployment benefit coverage etc.)

2.3 Efficiency of production (productivity, unit costs)

2.4 Correct handling (complaints handling, appeal system, corruption etc.)

2.5 Civil service (wages, employment laws, education, training, codes etc.)

2.6 Information technology (the use of computers, on-line services, IT-costs etc.)

2.7 Central, regional and local government responsibilities (what service at what level?)

2.8 Financing: the shares of taxes, fees, loans and contributions for various services; fairness of financing.

The list may easily be extended depending upon the particular interest of the Ministers.

Monitoring

The European Union is applying a new open method of co-ordination. It is based on monitoring progress, by way of indicators, along the Community strategy for social and economic development. A Council meeting is to be held each spring to assess progress in relation to targets in several fields, such as structural reforms of labour and other markets, social inclusion, research, and small and medium-sized enterprises.

One of the strategic areas is E-Europe. It is sub-divided into the following areas:

  1. European youth into the Digital Age
  2. Cheaper Internet access
  3. Accelerating E-Commerce
  4. Fast Internet for researchers and students
  5. Smart cards for secure electronic access
  6. Risk capital for high-tech SMEs
  7. eParticipation for the disabled
  8. Healthcare online
  9. Intelligent transport
  10. Government online

The tenth area – just like the others – is now being benchmarked. The benchmarking is based on a commonly agreed set of 20 services to citizens. The aim is to portray and monitor progress in delivering these services on-line. The general framework for the benchmarking has been drawn up by a working group, while the actual task of measuring has been outsourced to a consultancy firm. Measurement is to be performed on a continuing basis, and the resulting information is to be used in the monitoring by the Council and the Commission.

The example illustrates several important points:

1) that indicators may constitute an important ingredient in a policy and its implementation

2) that, consequently, there is an “owner” of the benchmarking exercise

3) that the specific indicators are shaped in such a way as to match the specific requirements of the policy and its implementation

4) that it can be clearly envisaged for what purpose the indicators are collected and the usage they will be put to

5) which in turn makes clear what the demands are on the reliability and validity of the indicators

6) that the collection of indicators – of reasonable reliability and validity – is costly

A parallel that could be relevant for the informal co-operation of the EU member states in the field of public administration would be if the Ministers of Public Administration were to form a policy on public administration (or some aspect of it) and, as a vehicle of implementation, wished to monitor progress by the use of indicators. The policy – means and ends – would constitute the basis from which to distil the kind of indicators needed, and also to conclude what level of precision would suffice. The indicators would be part of a policy and the benchmarking process would have an “owner”. Resources would be allotted to a working group or some other designated organisation to do the job of collecting the indicators.

Policy option 3: Work out a common programme for (some specific part or aspect of) the public administrations of the EU 15 and monitor progress by indicators.

Citizens’ voice and choice

Quite a different purpose is that of allowing citizens to make more informed judgements, and perhaps even choices, among service providers on the basis of service indicators, e.g. indicators that tell citizens about differences in quality, service delivery and perhaps even costs. With this purpose in mind, indicators should be collected and widely publicised to the citizens of Europe. Citizens would be expected to compare the indicators from different countries to form an opinion on the service standard in their own country and, on the basis of such comparisons, formulate their demands on service provision. Comparisons of service levels would also assist citizens in choosing among service providers – in the first place at the national level, by giving citizens a benchmark – best practice – to go by, and in the second place at an international level, if it is possible to choose among service providers from various countries. With this purpose in mind, indicators should be designed to give citizens information about services that is relevant from a consumer perspective. The consumer perspective may differ from the policy perspective and therefore demand different indicators: such things as the variance of delivery times and regional differences may be of much greater relevance to citizens than to policy-makers. Also, the accuracy of the measurements will be of much greater importance than if the indicators are used solely by policy-makers.

Policy option 4: Choose a specific area of public service where it is likely that information on service levels will prompt citizens to act for the development of those services.

Areas of service delivery of great importance to citizens, such as health care, education, social security and police protection of life and property, are suitable for such a programme.

Harmonising public service levels

A step further would be to create a common European area for quality in the public services (or in a specific public service): with increased mobility, citizens of the EU member states will not only expect services of similar quality wherever they go within the EU, they may also expect to be treated as if the service were delivered by one integrated system. Harmonising service levels may, in fact, ease mobility. Harmonisation may mean attaining minimum levels of service and need not mean similar levels of service.

With a program for harmonisation, indicators would be needed to monitor progress in that direction.

General harmonisation is perhaps not desired, but in specific areas harmonisation may be wanted, such as the handling of asylum seekers, the processing of customs duties and taxes, and court proceedings and appeals.

Policy option 5: Develop a policy of harmonisation – or minimum levels – of service provision, such as accessibility, in specific public service areas across the EU 15. Indicators matching the desired harmonisation would then be developed and compiled.

At this stage it is premature to bring up such proposals; they are mentioned here, however, as a possible future use should they come to be considered desirable.

Benchmarking public administration on a macro level

Many countries regularly compare economic performance. The growth rate of the economy as a whole, export growth, inflation etc. are compared. The level of unemployment, investment, terms of trade etc. are assessed and compared with other countries, as well as over time. Without comparisons these numbers are meaningless. It is only when compared with other periods and countries that it is possible to analyse and understand cause and effect and to assess performance – how good is good?

This benchmarking requires enormous amounts of comparable statistics that have taken a long time to develop. Analytical methods have been refined successively and nowadays encompass complete econometric models of national economies. Questions that may be asked, and sometimes answered, are for example:

- Is the rate of growth of GDP what should be expected given labour and capital inputs? (A standard way of formalising this question is sketched below.)

- Is inflation higher or lower than expected given the level of unemployment?
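For the first question, the usual formalisation in the economics literature is growth accounting; the memo does not spell it out, so the identity below is included only as an illustration of the kind of calculation involved. With Y for GDP, K for capital input, L for labour input and α for the capital share of income, expected growth is the share-weighted growth of the inputs, and the remainder – the “Solow residual” – measures how much growth cannot be explained by inputs alone:

\[
\Delta \ln Y \;\approx\; \alpha\,\Delta \ln K \;+\; (1-\alpha)\,\Delta \ln L \;+\; \Delta \ln A
\]

Benchmarking then amounts to comparing the residual term, Δ ln A, across countries and over time.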

Another example of benchmarking on a macro level, more relevant to public administration, is given by the World Health Report 2000. The report is an effort to benchmark the health care systems of 191 countries, using a sophisticated method of comparisons. In short, a number of aggregate indicators of the outcomes of a health care system are compared with resource inputs. The health care indicators are