Performance Management in Dutch Central Government

Published: 1 March 2004
DOI: 10.1177/0020852304041229
International Review of Administrative Sciences, Vol. 70(1): 33–50

Nico P. Mol and Johan A.M. de Kruijf

Nico P. Mol is Professor of Public Financial Management at the University of Twente and at the Royal Military Academy in The Netherlands. Johan A.M. de Kruijf is Assistant Professor of Public Finance at the University of Twente.

Abstract
This article investigates how and to what extent performance indicators in Dutch central government are actually embedded in performance management. In a case study encompassing 12 government organizations, the relevance of the indicators presented is analysed in three stages: (1) with respect to the responsibilities for results intended in performance measurement, (2) with respect to the responsibilities actually implied in resource allocation and (3) with respect to the responsibilities ultimately to be inferred from the governance (planning and control) systems applied. In our research, management control systems appear to be only partially tuned to the performance indicators specified in advance. The familiar expression 'What you measure is what you get' is thereby invalidated by all kinds of restrictions imposed on a manager's actual responsibility for measurement outcomes.

Introduction
As in many other countries, efforts to measure performance in Dutch government organizations have increased substantially over the last few decades. In central government, the first attempts to measure performance systematically (outputs in particular) originated from the Government Budgeting and Accounting Act 1976. In this Act, the specification of performance indicators was prescribed 'whenever feasible and relevant'. Subsequently, performance measurement has gradually been developed into a system of performance budgeting, more or less implemented in the central government's 2002 budget.

Obviously, these efforts may be attributed primarily to the deficiencies in the budget mechanism already noted by the US Hoover Commission in 1949. As the
results obtained are not immediately given by revenues representing consumers' monetary valuations, they have to be inferred from explicit measurement of the goods and services produced and supplied to society. However, the performance indicators used in this measurement will specify volumes rather than values, so their validity in representing citizens' valuations may be questioned. As a consequence, performance paradoxes may arise, in which measured and perceived performance develop in opposite directions (Van Thiel and Leeuw, 2002). More generally, ambiguities will arise in the interpretation of indicator variances, providing opportunities to manipulate the 'results' in the accounts presented. In view of differing interests, for example between (internal) managers and (external) auditors (Halachmi, 2002), performance measurement may thereby degenerate into 'politics'.

Usually, the validity of the indicators presented is left undetermined. Their relevance for decision-making is not explicitly specified but has to be deduced from their actual use in performance management. Moreover, empirical research on the use of performance indicators rarely provides unambiguous evidence of this relevance. In most cases, the actual consequences of indicator variances for performance management, to be inferred from resource allocation, cannot be positively assessed (e.g. Jordan and Hackbart, 1999; Wang, 2000; Pitsvada and LoStracco, 2002). Clearly, reporting on performance may become unreliable when the status of
the indicators in the governance of the organization remains undefined. Performance
indicators may be oriented towards outcomes, outputs, throughputs or inputs, each
requiring a different approach to controlling and reporting. The results presented may provide a 'true and fair' view of performance with respect to any of these characteristics, but they may also amount to window dressing, specifically when input or throughput indicators are presented as proxies for outputs or outcomes. To maintain
accountability, the variables actually governing performance management should be
disclosed.

This article aims to contribute to such a disclosure by investigating the characteristics of the government governance system in which performance indicators are embedded. We presume that the status of the indicators specified in resource allocation may be deduced from the instruments applied in planning and control. If planning and control are not tuned to the variances of the indicators that supposedly monitor performance, stakeholders should be wary of those indicators' actual relevance.

In our research, we have analysed planning and control instruments for a number
of government organizations to judge the coherence of their governance system and
the significance to be attributed to the performance measures defined for them.
To underpin our empirical analysis, we have developed a theory of the normative properties of this coherence. Starting from the continuity principle that supposedly governs performance in business (a positive balance of revenues over costs), we have tried to specify the adjustments needed to make this principle applicable to government organizations. Continuity requirements may thereby be interpreted as responsibilities for results, established in the allocation of resources to those organizations. We then tried to identify the extent to which the instrumentation of planning and control corresponds to the need to fulfil these responsibilities. Deviations may be judged to create opportunities for discretionary behaviour, to the detriment of the results formally endorsed as measures of performance. So our
