14 Results and Services Plans and Budget Reform in New South Wales

Marc Robinson
Published Date:
October 2007
John Pierce and Michael Di Francesco 

Getting the incentive structures “right” in public sector institutions is, to borrow a metaphor, a race with an ever-receding finish line. Performance budgeting, which can be defined as funding processes that use performance information to strengthen the link between resources and achievement, is one set of techniques to design these incentives, although the strategies vary significantly.

There is a large and growing menu of performance budgeting practices, and this volume aims to assess the options by asking three key questions: How effective are different types of performance budgeting as incentive structures? What implementation strategies were employed to effect behavioral change? If successful, to what extent are the critical success factors transferable?

This case study will describe the development and implementation of a distinctive form of performance budgeting adopted in the Australian State of New South Wales—a “funding plan” model that uses performance narrative as the basis for understanding the relationship between funding and results, and for discussing the continuing relevance of “baseline” funding. The chapter covers policy and organizational developments up to September 2005.

Using the performance budgeting taxonomy developed for this volume, the New South Wales model—known as the Results and Services Plan Budget process—corresponds with those categories of performance budgeting that aim for “looser” links between funding and results, one example of which is “program budgeting.” Its core objectives are to better align ministerial and bureaucratic priorities, and to deliver performance information about cost, service accomplishment, and results achievement that can inform allocation decisions.

The case study will profile some of the defining features of the New South Wales model and its implementation, including its focus on:

  • an annual budget process framed around a service funding plan—the Results and Services Plan—so that strategic discussion about incremental funding and policy issues occurs in the context of a “whole plan”

  • the use of “performance stories”—the results logic—to contextualize high-level discussion about the relative priority of funding issues and to clarify financial and non-financial performance expectations

  • a commitment to building agency capacity for both strategic planning and resource management through the results and services planning approach

  • the use of regular consultative review processes to test process modifications and work towards closer meshing between ministerial and bureaucratic priority setting and resource allocation decision-making.

Against the grain? Performance budgeting in New South Wales

Why is New South Wales of interest? During the late 1980s and early 1990s the Australian State of New South Wales was seen by many observers as a pioneer in public sector financial management reform. It was one of the first jurisdictions in the world to introduce full accrual accounting in the sub-national public sector, including consolidated financial statements, and in the commercial sector New South Wales implemented a systematic reform program that corporatized government businesses (mainly energy and transport utilities) to replicate commercial sector corporate governance and incentive structures (see Steering Committee on Government Trading Enterprises, 1988, for an overview of how the policy framework developed).

Since the mid to late 1990s, however, budget sector reform orthodoxy in Australia has revolved around accrual output budgeting, defined broadly as the purchase and monitoring of contractually specified outputs from service delivery agencies on a full accrual cost basis (see, for example, Robinson, 1999, 2002; Carlin and Guthrie, 2001). While in New South Wales the preparation of departmental budget estimates has been on a “program” basis since 1986, and rolling three-year expenditure limits (forward estimates) have been in place since 1989, these developments represented the farthest distance traveled on the performance budgeting road.1 As a consequence, according to some commentators, New South Wales has earned a reputation among the Australian states for being “somewhat recalcitrant” (see Carlin and Guthrie, 2003, p. 150). The criticism is undeserved, for three important reasons.

First, it disregards historical legacy. Sub-national (or State) government administration in Australia is fundamentally about the efficient delivery of essential services, and as such the priorities for organizational reform are often driven by the practicalities of ensuring that service provision meets the demands of local constituencies. Prominent students of Australian State political systems have observed that, by comparison with its State siblings, New South Wales’s political and bureaucratic culture has little inclination for radical reform and has more often than not been marked by a “climate of skepticism” about managerial change (see, for example, Painter, 1987; Halligan and Power, 1992; Alaba, 1994).

Second, it discounts intention. New South Wales Treasury has employed a deliberate strategy of “watching and learning,” as much to avoid costly errors as to learn lessons about which elements of systematic reform programs could be applied in this jurisdiction. New South Wales has questioned the practicability of implementing accrual output budgeting, chiefly on the basis that output purchase models adopted in other jurisdictions have yet to demonstrate clearly that they improve the way government works in practice.2

Third, it overlooks context. Between 1995 and 2005 the Council on the Cost and Quality of Government (CoCQoG) operated as a key advisory body to the New South Wales Government on the review and reporting of service delivery performance in the public sector. A key output of the Council was the series of whole-of-government Service Efforts and Accomplishments (SEAs) reviews that summarized the activities and performance of budget dependent agencies on a policy sector basis (for example, the law, order, and public safety sector) (see CoCQoG, 2003, 2004). While the reviews were largely a consolidation of historical data—each year’s review reported sectoral performance on a rolling basis for the previous five years—the SEAs initiative did give the development of performance indicator standards a higher profile across the public sector.

Against this background then, the service funding model of performance budgeting that has emerged in New South Wales reflects a distinctly “pragmatic” response to national and international budget reform trends, and is a product—sometimes based on careful “trial and error”—of a strategy of lesson-drawing.

Expenditure budget process in New South Wales: key institutions

To aid understanding of the performance budgeting reform process, this section will introduce the State of New South Wales and provide an overview of the key institutions of government and their role in the current expenditure budget process.

Situated on the eastern seaboard of the Australian continent, New South Wales is the oldest, largest, and wealthiest of the six States (and two Territories) that comprise the Commonwealth of Australia.3 Its population of 6.7 million is highly urbanized and dominated by the mid-coastal conurbation centered on Sydney, Australia’s largest city, and bounded by the old industrial cities of Newcastle in the north and Wollongong in the south.

The New South Wales economy accounts for over one-third of national output and, while highly diversified, like other advanced economies is increasingly services-based. Sydney is a center for financial, insurance, and information and communications technology services in the Asia Pacific region, while the State economy retains significant strengths in agricultural production, mining and minerals processing, and manufacturing.

New South Wales has a Westminster form of responsible parliamentary government where traditionally the executive government has significant influence over taxation measures and the details of expenditure (see, for example, Parker, 1978; Smith, 2003). It does, however, also operate within the system of intergovernmental financial arrangements that mark Australian federalism. While sub-national government in Australia delivers many “everyday” functions like hospital, school, urban transport, policing, infrastructure, and recreational services, revenue-raising capacity lies principally with the federal government.4 The federal government’s use of specific purpose (or “tied”) funding mechanisms means that State discretion over the nature of some key services is increasingly constrained, while the distribution of general purpose (or “untied”) grants is governed by a complex system that seeks to equalize States’ capacity to provide services, resulting in significant per capita transfers from larger to smaller States.5

As in other Westminster systems the annual expenditure budget process is an executive government responsibility. Budget preparation involves four key institutions: cabinet, the central agencies (including New South Wales Treasury—hereafter “the Treasury”—and the Cabinet Office), portfolio ministers, and service delivery agencies. The following discussion will profile the role of Cabinet and the central agencies.

Cabinet, which in New South Wales comprises the ministry,6 is the peak decision-making forum for executive government. Two of its standing committees have responsibility for resource allocation.

  • The Standing Committee on the Budget or Budget Committee (made up of the Premier and Treasurer, Deputy Premier and Minister for Transport, the Minister for Finance, and the Assistant Treasurer) sets the budget strategy (including major budget targets); approves updated forward estimates; reviews recommendations from the Treasury on portfolio funding proposals and options for portfolio savings; and determines final budget allocations and savings requirements for ministers and portfolio agencies.

  • The Standing Committee on Infrastructure and Planning (which is also chaired by the Premier) sets strategic direction for urban and regional infrastructure development—for example, the Metropolitan Strategy for managing urban development in and around Sydney. It also reviews infrastructure proposals, and makes recommendations to Budget Committee for infrastructure financing decisions.

The executive government’s key public service advisor in the budget process is the Treasury.7 The Treasury supports the Budget Committee by maintaining the three-year forward estimates system, advising on budget aggregates (or “the budget result”), advising on the merits of individual portfolio funding proposals, and advising on infrastructure proposals and individual agency capital programs. The Cabinet Office, which is the Premier’s principal source of advice on whole-of-government priorities, works with the Treasury to review the consistency of current and proposed funding with government priorities.

In the early 1990s the New South Wales Budget system was described as “multi-year target budgeting with discretionary (global) estimates in a program framework” (Nicholls, 1991, pp. 182, 190) and, by and large, this is still the case. In other words, estimates are prepared for the budget and three forward years on a “no policy change” basis; there are elements of “target budgeting” in that the forward estimates act as a baseline (or target) for agency expenditure; there are elements of conventional “bid and review” budgeting in which ministers and agencies are invited to submit program-based funding proposals;8 and ministers, to whom allocations are made, have limited discretion to transfer recurrent funds between programs within their portfolio.

Budget authorization requires approval by parliament. In May each year the executive government presents its budget to parliament, setting out the estimates of expenditure for the financial year commencing in July.9 In New South Wales while estimates of agency expenses are prepared and reported on an accrual basis, parliament appropriates from the Consolidated Fund the cash funding required to fund total expenses in the year.10 This is referred to as a “net appropriation” since service delivery agencies are able to retain revenue from their own sources (such as user charges) such that the cash appropriation approximates the “net cost of services”; that is, the difference between total expenses, retained revenue and the gain/loss on disposal of non-current assets.11
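The “net cost of services” arithmetic described above can be sketched in a few lines. The agency figures below are hypothetical and purely illustrative; only the structure of the calculation comes from the text.

```python
# Illustrative sketch of the "net appropriation" calculation described
# in the text: the cash appropriation approximates the net cost of
# services. All agency figures below are hypothetical.

def net_cost_of_services(total_expenses, retained_revenue, disposal_gain):
    """Net cost of services: total accrual expenses, less revenue the
    agency retains from its own sources (e.g. user charges), less any
    gain (or plus any loss) on disposal of non-current assets."""
    return total_expenses - retained_revenue - disposal_gain

# Hypothetical agency figures ($ million)
total_expenses = 520.0    # total accrual expenses for the year
retained_revenue = 80.0   # user charges and other own-source revenue
disposal_gain = 5.0       # gain on disposal of non-current assets

# The cash appropriation from the Consolidated Fund approximates this amount
appropriation = net_cost_of_services(total_expenses, retained_revenue, disposal_gain)
print(appropriation)  # 435.0
```

A loss on disposal would enter with the opposite sign, increasing the appropriation required.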

The final institutional setting informing the expenditure budget process is legislative “fiscal rules.” Between 1995 and 2005, governments in New South Wales were required to consider the fiscal policy framework set out in the General Government Debt Elimination Act 1995.12 The principal objective of the legislation was to “maintain fiscal results that are fiscally sustainable in the medium and long term.”13 The Act set out specific fiscal strategy “targets”—for example, in the medium term, “to reduce general government net debt to sustainable levels by 30 June 2005”—as well as fiscal principles to guide budget management—for example, that “the budget should be framed to achieve a fiscal result for the general government sector consistent with the fiscal targets.” The budget targets set by Budget Committee in the early stages of each year’s expenditure budget process were determined with reference to these targets and principles. Following the achievement of the Act’s short- and medium-term targets in 2005, the key fiscal targets and principles have been updated to focus on controlling the level of general government liabilities and constraining long run average growth in expenditure.14

Having profiled the key institutions in the expenditure budget process, the three sections that follow will describe the development and implementation of the service funding model of performance budgeting that has emerged in New South Wales.

Starting point: service and resource allocation agreements (stage 1 reform)

The starting point for recent budget reform in New South Wales was the issue of the Financial Management Framework in December 2000 (NSW Treasury, 2000a). A distinctly pragmatic approach to improving financial management and budgeting, the Framework is a “consolidation” of the periodic modifications made over the previous 15 years (and discussed briefly in this chapter). It is aimed squarely at delivering “value for money” by building incentives for government agencies to assist “resource allocation” and improve “resource management” (which equate broadly to allocative and productive efficiency, the key concepts used in this volume).15 While these objectives are consistent with reform agendas in comparable jurisdictions, the means of achieving them have been different.

The Framework noted the widespread adoption in Australia of “output budgeting” (the system of “accrual output budgeting” referred to earlier) and expressed concerns about its “complexity” and “practicability of implementation” (NSW Treasury, 2000a, p. 13). Output budgeting hinges on the capacity of central decision-makers in executive government to specify desired outputs in detail (and hence surmount its “relative lack of knowledge about program and service delivery options”), as well as on a preparedness to involve itself in the detail of service delivery. Often, despite the best intentions, the reality of government administration does not allow for this. In addition, experience in comparable jurisdictions suggested that contractual purchase and monitoring of “output” potentially can create perverse incentives, including incentives to focus on what is done rather than what is achieved.

As a consequence, New South Wales set a different course towards performance budgeting based on funding a service delivery plan. Under this approach, the executive government sets high-level desired outcomes and associated performance targets, with program and service delivery design and implementation devolved to agencies. Each service delivery agency is then accountable to the executive government for the efficient implementation of the plan as well as for demonstrating progress towards achieving the desired outcomes.

The platform for this performance budgeting approach was the Service and Resource Allocation Agreement (SRAA). The SRAA was a service delivery agreement negotiated with the Treasury where agencies “describe the way in which the funding allocated to them is used, accompanied by efficiency and effectiveness data” (NSW Treasury, 2000a, p. 9). It aimed to integrate characteristics of a funding agreement (linking an agency’s budget funding to its service delivery), a performance agreement (linking the service delivery to financial and non-financial performance targets), and a strategic plan (linking agency planning to emerging community need and changing executive government priorities).

SRAA implementation applied an “output and outcomes” methodology. The agreements were framed around detailed “outcome statements” prepared for each of the desired outcomes that an agency sought to achieve. However, the pilot format for the SRAA, introduced in 1999-2000, focused implementation on full coverage of agency outputs, and encouraged agencies to develop performance targets for output delivery as a preliminary basis for monitoring financial performance against quarterly budgets.16 While the SRAA was intended to link funding to measurable outcomes, the pilot experience quickly found that the agreements were framed around outputs and output-related performance information (and specifically a preponderance of quantity measures, for example, the number of tasks performed). This indicated that agency (and Treasury) understanding of the relationship between outputs and outcomes was at a rudimentary level.

A key development, introduced for the 2001-02 budget cycle, was to supplement the output and financial compliance focus with broader organizational performance measures and indicators. The “balanced scorecard” method for mapping and measuring non-financial (customer, business, and capability) perspectives on organizational performance was the preferred framework (NSW Treasury, 2000b, pp. 9-13; for more information about the “balanced scorecard” see Kaplan and Norton, 1992).

The 2001-02 budget process also clarified the key objectives of the SRAA process as supporting strategic discussion between the Treasurer and portfolio ministers (and between the Treasury and agencies) about issues that could be addressed in the budget process, and providing a common framework for assessing agency performance. To meet these objectives, early in the budget cycle agencies were asked to submit draft SRAA outcome statements prepared on a “no policy change” basis along with agency funding proposals. In other words, it was intended that the draft SRAA and SRAA budget proposals comprise an agency’s budget proposal documentation.17 At the end of budget preparation, the final SRAA would be updated to reflect decisions of Budget Committee and signed off by both the Treasurer and the portfolio minister as an agreement.

For the pilot agencies, the SRAA budget process was used during the 2001-02, 2002-03, and 2003-04 budget cycles. Each year’s experience was subject to a comprehensive review, and a number of implementation issues appeared regularly.18 While there was a general recognition that the SRAA sharpened the focus on “what the Government wants,” and signaled a consistent approach to understanding agency issues, the agreement process suffered from a number of flaws. The SRAA’s aspirations to “comprehensiveness” meant that each agency’s agreement was lengthy and unwieldy—agencies were encouraged to list all relevant indicators rather than concentrating on those actually used by agencies to manage program delivery. This was compounded by an agency perception that the Treasury did not use the SRAA for either agency discussions or budget monitoring, and that the SRAA—as a planning exercise associated strongly with the Treasury—had marginal relevance for ministers. Evidence for this came from its absence in bilateral discussions between the Treasurer and portfolio ministers about funding proposals.

This perception was mirrored in internal Treasury feedback. To Treasury analysts (that is, budget officers) the SRAA seemed to be an “add-on” to the budget process and its credibility was undermined when stakeholders did not see it referred to directly by ministers or executives. As a consequence there was a tendency for Treasury analysts to see negotiation of the agreement with their agencies as a task that added negligible value to their relationship management role. In short, the SRAA was seen as a “separate exercise” and hence disconnected from the budget process.

The links between resource allocation and resource management also remained at a delicate stage for agencies. The SRAA was (generally) prepared at some distance from more established internal agency planning processes. At the same time, suspicions were raised about the Treasury’s intentions for the SRAA budget process. For example, some agencies expressed concerns about the prospect of Treasury analysts linking short-term fluctuations in performance targets with specific discussion about current and future funding levels. These agencies wanted the Treasury to give more emphasis in the SRAA process to understanding what influences the relationship between an agency’s activities and the government’s long-term outcomes, and how this should be gauged.

Against this background the 2003-04 budget process proved to be a transition stage in the development of a service funding model. In a pre-election period19 the government established the ad hoc Major Issues and Strategies Committee (MISC) of Cabinet to steer priority-setting. Operation of the MISC confirmed that the SRAA had a low profile in high-level officer consideration of funding and service delivery priorities. In effect, the 2003-04 budget process crystallized Treasury thinking about how to integrate the agreements with cabinet processes. In particular, the weight of both ministerial and agency feedback hastened the adoption of a more intuitive approach to understanding and measuring agency performance—this took the form of “performance stories” that could explain the contribution that agency activities made to desired outcomes and help guide the selection of performance indicators.

Linking planning and budgeting: results and services plans (stage 2 reform)

After the March 2003 State election the government announced a renewed commitment to improving public service delivery within existing allocations and reprioritizing current programs and activities to fund new initiatives.20 There was a growing realization in government that a combination of temporary stresses and structural dynamics were emerging as a risk to achievement of the fiscal strategy. The former included an expected decline in the rate of growth in New South Wales’s most important own source revenue stream—transfer duties on property sales—while the latter principally comprised a declining New South Wales share of federal (Goods and Services Tax) revenue (discussed briefly in this chapter). Better management of these risks required both aggregate expenditure restraint and reinforcement of budgeting tools for identifying and implementing priorities.

The configuration of cabinet structures—and the signals these send throughout the public sector—is one of the first tools available to any executive government. Service delivery improvement and expenditure restraint were embedded through the creation by cabinet of a new standing committee, the Service Provision and Financial Management (SP&FM) Committee. It is important to note that this committee’s resource monitoring function was intended to complement the Budget Committee’s resource allocation role. The SP&FM Committee was to maintain the spotlight on high-profile service delivery portfolios and monitor both budget compliance and, particularly for funded initiatives commencing in the budget year, service delivery performance against agreed indicators and targets.

The Treasury also recognized that the service delivery focus required an overhaul of the budget process to integrate it with new cabinet structures—hence the 2004-05 budget process was a significant departure from the traditional budget cycle. The pre-election MISC experience confirmed that the conventional “bid and review” process could be made both more efficient and more strategic (acknowledging the potential for improved collective priority-setting mechanisms early in the process).21 In effect, the existing process asked, “What additional budget allocation do you want?” rather than “How well are you using your existing budget allocation?” and “How can you improve service delivery?”

The response was a two-stage budget process. The first “scene-setting” stage was framed by dedicated meetings of the Budget Committee (and an advisory group of high-level officers) to discuss emerging policy issues and to determine broad priority-spending areas.22 These meetings were informed by a series of chief executive-level meetings designed to discuss funding and service delivery issues that might need to be addressed in the budget process.

While all ministers and agencies were invited to submit maintenance of effort funding proposals, the outcomes of the scene-setting meetings formed the basis for a second “invitation” stage where ministers and agencies were selected to submit enhancement of effort funding proposals. The process would be anchored by a common information base for assessing services and their cost, the Results and Services Plan (RSP).

The RSP replaced the SRAA at the beginning of the 2004-05 budget process and extended service delivery and funding plans to all budget dependent agencies.23 The RSP developed initially as a “summary” of the SRAA (and pilot SRAA agencies were grandfathered into the new arrangements by preparing their RSP as an overview to their current year agreement). While the RSP reflects the intent of the SRAA, it addresses some of the performance budgeting issues raised in the SRAA pilot in different ways.

The RSP is a high-level business plan that sets out the services that can be delivered and the results that can be achieved with an agency’s current budget allocation. Learning from the SRAA experience, the foundation of an RSP is a clearer description of the relationship between an agency’s activities and the outcomes it expects to achieve in the community, economy, or environment. Learning from the experience of comparable jurisdictions, the RSP uses “intervention logic” (or what is referred to in New South Wales as “results logic”) to develop and convey performance narrative information (NSW Treasury, 2004a; for more information about intervention logic, see Schacter, 2002; Baehler, 2003).24

The structure and content of an RSP are set out in Figure 14.1. An RSP is a short, plain-English planning document—a maximum of only ten pages, compared with the 30-plus pages of an SRAA—that captures the salient information being generated by an agency’s more detailed internal planning processes. As this suggests, it is important that the RSP be a “by-product” of agencies’ strategic and operational planning; it also indicates that the plan’s brevity is designed to give ministers and high-level officers a “snapshot” of priorities and resource deployment.25

Figure 14.1. Structure and content of the Results and Services Plan

As Figure 14.1 explains, an RSP consists of eight parts that capture performance narrative, and funding issues relating to, for instance, strategic scanning, efficiency improvement, organizational capability planning, and risk assessment and management. The foundation, however, is the results logic—or “performance story”—diagram that appears in Part 1 of an RSP. Results logic is a form of business strategy mapping that is both a method for guiding selection of a core set of performance indicators and a context for explaining the impact of funding and other issues on each part of the plan; that is, how planned service delivery can be managed with variations to the plan.

There is then merit in describing results logic methodology in more detail. Figure 14.2 is extracted from Treasury guidance material and shows the one-page results logic for a fictional law enforcement agency, the Southland Department of Law Enforcement (see NSW Treasury, 2004b). This results logic is used by Southland to explain its capacity to influence results and to justify its selection of performance measures.

Figure 14.2. 2006–07 Results logic (results hierarchy)

a 2006–07 budget figures are in $’000s.

The level of influence is explained through intermediate results, which are the more concrete objectives that Southland needs to achieve in the shorter term in order to contribute to results in the longer term. For example, one desirable long-term result is “fewer violent crimes and crimes against property.”26 The intermediate results should demonstrate plausible “cause and effect” linkages between services and this result; for instance, if community education services (such as community awareness campaigns about crime prevention) are delivered as expected, then Southland can expect increased community awareness of strategies for personal safety and property protection, and if this impact is achieved, then the agency can be reasonably confident that more people will take action to protect themselves and their property. While Southland will have some influence over intermediate results, it should have full control over how efficiently it delivers services.
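The cause-and-effect chain above can be rendered as a simple hierarchy. The sketch below uses the wording of the Southland example; the data structure and rendering function are our own illustrative device, not part of the Treasury methodology, which presents the results logic as a one-page diagram.

```python
# A minimal sketch of a results logic ("performance story") hierarchy,
# using the illustrative Southland example from the text. The Python
# representation is our own; Treasury guidance uses a one-page diagram.

results_logic = {
    "result": "Fewer violent crimes and crimes against property",  # long-term result
    "intermediate_results": [
        {
            "description": "Increased community awareness of strategies for "
                           "personal safety and property protection",
            "leads_to": "More people take action to protect themselves "
                        "and their property",
        },
    ],
    "services": [
        "Community education (e.g. community awareness campaigns "
        "about crime prevention)",
    ],
}

def performance_story(logic):
    """Render the 'if ... then ...' chain implied by the hierarchy."""
    lines = []
    for service in logic["services"]:
        lines.append(f"If '{service}' is delivered as expected,")
    for ir in logic["intermediate_results"]:
        lines.append(f"then we expect: {ir['description']},")
        lines.append(f"and in turn: {ir['leads_to']},")
    lines.append(f"contributing to the long-term result: {logic['result']}.")
    return "\n".join(lines)

print(performance_story(results_logic))
```

Reading down the hierarchy moves from what the agency fully controls (service delivery) to what it can only influence (long-term results), which is the point the text makes about matching accountability to capacity to influence.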

In this way the results logic can be used by Southland as a high-level context for explaining its capacity to influence long-term results and hence clarifying with the Treasury the performance expectations against which it can be held accountable. Stepping down through the results logic shows how accountability for “performance” will match the capacity to influence. This is also illustrated in Figures 14.3 and 14.4, which are also extracted from Treasury guidance material. Figure 14.3 shows a small selection of agreed result indicators that correspond with the intermediate results identified in the results logic; Figure 14.4 sets out a manageable number of service measures that capture key attributes of service accomplishment for each service group, for example, efficiency, quality, and timeliness. For both sets of performance information estimates are provided for the budget year and three forward years. The service measures provide a basis for Treasury monitoring of planned service delivery performance over the short term; result indicators can inform ministerial discussion of relative funding priorities over the medium and longer term.

Figure 14.3. Result indicators

Figure 14.4. Service measures
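To illustrate the shape of this performance information, the sketch below lays out service measures with estimates for the budget year and three forward years, as described in the text. The service group, measure names, and figures are all invented for illustration; they are not drawn from the Treasury guidance material.

```python
# Hypothetical illustration of service measures carrying estimates for
# the budget year and three forward years. Measure names and values are
# invented; only the shape of the information comes from the text.

years = ["2006-07", "2007-08", "2008-09", "2009-10"]  # budget + 3 forward years

service_measures = {
    "Community education": {
        "Campaigns delivered (quantity)":        [12, 12, 14, 14],
        "Cost per campaign, $'000 (efficiency)": [85, 83, 82, 80],
        "Participant satisfaction, % (quality)": [78, 80, 82, 82],
    },
}

for group, measures in service_measures.items():
    print(group)
    for name, estimates in measures.items():
        row = ", ".join(f"{y}: {v}" for y, v in zip(years, estimates))
        print(f"  {name} -> {row}")
```

The same four-year layout would apply to the result indicators in Figure 14.3, the difference being that result indicators track intermediate results rather than attributes of service accomplishment.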

The 2004-05 budget process was the beginning of a staged implementation of a more strategic resource allocation process. Certainly, in terms of workflow management, the invitation process worked efficiently, with the number of enhancement of effort proposals declining from hundreds to a few dozen, based on the spending priorities identified by Budget Committee. As a tool for strategic discussion the RSP had mixed success. Agencies were given a tight timeframe to prepare the plans, and the average quality reflected this. The intended role of the chief executive meetings—as the main conduit for agencies to raise emerging funding risks—was blunted by inconsistent implementation by central agencies. As some agencies later explained, the purpose of the meetings was heavily dependent on an agency’s relationship with its Treasury analyst.

In March 2004 the Treasury conducted a comprehensive post-implementation review where three issues loomed large:

  • the need for the Treasury to provide more detailed guidance and support to agencies on RSP preparation

  • the potential for better collaboration and information sharing between agencies on RSP development (including the incorporation of “cross-agency” issues)

  • tighter integration with agency corporate planning processes and the need for agency senior executive engagement with the process as a priority-setting exercise.

The process for the 2005-06 budget was modified to reflect these concerns. The initial step was consolidating the development of the RSP as an “agency” business plan. The Treasury’s communications strategy reinforced the underlying rationale for the new process—the RSP was about better planning, and the key to better planning was the results logic method. A clear message from the better-practice agencies was that significant benefits for performance and financial management could be realized by integrating the RSP with existing agency planning processes (for example, strategic, corporate, business, resource, and risk management planning).27 Hence a secondary focus for Treasury implementation was “selling” the message that the RSP should be a by-product of internal agency planning processes and not a stand-alone exercise. This would take time—the introduction of the RSP for the 2004-05 budget process had not been timed to feed into these annual planning processes.

These RSP development initiatives accompanied further modification of the new budget process. The two-stage invitation process was retained and the “scene-setting” phase was enhanced to enable an early exchange of information about service delivery issues and/or risks that could be addressed in the budget and forward years. For the first time, however, the scene-setting meetings would “roundtable” with portfolio ministers on a “cluster” basis.28 Meeting on a cluster basis was intended to promote broad discussion about the quantum and relative priority of ministers’ spending intentions in the context of “sectoral” spending areas. It was also designed to examine how funding pressures and opportunities for joint service delivery initiatives—for example, integrated client management in human services—could be addressed on a cross-agency or cross-portfolio basis. The outcomes of the cluster-based scene-setting phase would assist the Budget Committee in determining maintenance of effort funding and invitations for enhancement of effort proposals.

The cluster-based ministerial scene-setting meetings were informed by trilateral chief executive level meetings between agencies, the Treasury, and the Cabinet Office. Both sets of meetings were tasked with answering key questions about strategic and emerging issues and organizational capability in the context of an agency’s RSP. These questions were designed to expose funding and service delivery issues not currently captured in the forward estimates, and medium-term risks to the sustainability of service delivery, including how the agency proposed to manage these risks. In this way the RSP was “built in” to the scene-setting phase.

The relationship between resource allocation (the budget process) and resource monitoring (a function of the SP&FM Committee of Cabinet, discussed earlier) also became more established around the RSP. The service delivery and budget compliance focus initiated after the March 2003 election continued through the 2005-06 budget process, and was transformed into a more searching “expenditure review” process designed to determine the continuing relevance of current activities. For those ministers and agencies appearing before the committee, the RSP was the first port of call for explaining the current services profile and allocation of resources. The RSP provided a “window” on the quality of agency planning and the activities that could be targeted for more detailed examination.

Finally, in conjunction with modifications to both the RSP and the budget process, the Treasury began to address its own organizational capability issues. In July 2004 an internal initiative to better define the role of the Treasury analyst (or budget officer) began; its key objective was to align expanding expectations about performance budgeting with the wherewithal to deliver advice on agency financial and non-financial performance. Taking the new RSP budget process as its centerpiece, the initiative examined how “performance analysis” capacity could be strengthened, for example, using team construction to get the “skill set” right, re-emphasizing the “relationship management” aspect of the analyst’s role, and, within the contours of often pressing work flows, encouraging a more strategic approach to assessing and managing funding issues.

A more strategic budget process: using the results and services plan as a funding plan (stage 3 reform)

The current 2006-07 budget process, which commenced in September 2005, is the next stage of reform. The first two iterations of the Results and Services Plan budget process acknowledged that the quality of the RSP was important to generating “demand” for performance information, and so the focus has been on raising awareness and building planning capability among agencies and Treasury analysts alike.29 The current stage aims to consolidate the use of the RSP as a tool for strategic discussion about funding and performance, but also commences use of the RSP as a “funding plan” within a proposed “performance management and budgeting” framework (set out in Figure 14.5).

Figure 14.5. A proposed “performance management and budgeting” framework

The previous section of this chapter discussed progress in establishing more effective planning structures and a supporting information base. The next steps involve using the RSP to build stronger links between each further element of the performance management cycle—funding, monitoring, and reporting. In broad terms the framework could develop along the following lines.

As an agreement between the Budget Committee and portfolio ministers, the RSP sets a small number of key performance indicators that ministers and agencies will report on and manage towards. These indicators will align chief executive performance agreements with agency strategic plans and business unit service delivery plans. Planned performance based on the results, services, estimates of expenses, and key performance indicators agreed in the RSP will be reported in the annual Budget Papers presented to parliament. Performance indicators are currently being negotiated as part of the budget process, and Budget Paper reporting is transitioning to a results and services format.

Over time the SP&FM Committee could hold portfolio ministers and agencies accountable for delivering the agreed level of service delivery performance in the Results and Services Plan. This will be achieved by regular monitoring of actual performance against that planned in the RSP. At the same time the SP&FM Committee will also monitor any efficiency targets required of agencies (including the impact on achievement as set out in the RSP). Initially this may concentrate effort on ensuring that large portfolio ministers and agencies deliver services within budget.

Closing the performance management cycle is the reporting of actual performance. The performance information developed for the RSP could provide a core set of data to meet external reporting requirements. One option is to report actual financial and non-financial performance for the previous year in departmental Annual Reports.

Learning from the New South Wales experience

This chapter has traced the development of the New South Wales Results and Services Plan budget process over the last five years and described the key features of the model. It introduced the New South Wales model as one of those performance budgeting systems that sought to establish “looser” links between funding and results. “Looser” linkage here means an intention to build among decision-makers both a commitment to a small set of priority actions (and associated indicators of progress) and an awareness that selecting between competing priorities will always carry an opportunity cost. The descriptive case study has confirmed this: reform has focused on process modification that aims to embed high-level planning and performance information in the budget process.

The Results and Services Plan budget process remains in the early stages of implementation. However, four key features of the model stand out:

  • an annual budget process framed around a service funding plan—the RSP—so that strategic discussion about incremental funding and policy issues occurs in the context of a “whole plan”

  • the use of “performance stories”—the results logic—to contextualize high-level discussion about the relative priority of funding issues and to clarify financial and non-financial performance expectations

  • a commitment to building agency capacity for both strategic planning and resource management through the results and services planning approach

  • the use of regular consultative review processes to test process modifications and work towards closer meshing between ministerial and bureaucratic priority setting and resource allocation decision-making.

While the budget reform process is far from complete, there are some key lessons that can be drawn from the New South Wales experience in terms of both the approach to reform and the preliminary outcomes for performance budgeting.

The first lesson relates to the value of the “performance story” approach as a foundation for strategic discussion. The Results and Services Plan budget process is about creating a climate in which performance information forms an unobtrusive background for discussion about funding and policy issues. Results logic uses simplified “cause and effect” narratives to help decision-makers understand the assumptions underlying service delivery strategies and to review how resources are currently allocated to these activities. While the RSP is also developing as a basis for matching funding to a small set of performance information, it is fundamentally directed at raising the level of central-line agency discussion in (and around) the budget process.

There is a growing list of examples of how the RSP is supporting better communication between the Treasury and agencies. Feedback from agency chief executives has confirmed that the RSP has been particularly useful in structuring high-level officer discussions, including debate about the continuing relevance of functions and activities. In a small number of cases the RSP process has helped restructured agencies identify functions (and associated resources) that do not fit within results logic and hence provided options for shifting activities from one agency to another. The RSP has also been employed as a platform for agreeing how longer-term issues and “merit” funding proposals (those which are assessed to be of worth but unable to be funded within existing constraints) can be managed without recourse to budget support.

This type of utilization is, however, not easy to achieve. It is clear that procedural integration (that is, mandating plans, changing templates, issuing timetables) is only ever the starting point. Substantive integration (that is, modified behaviors and attitudinal change) is dependent on making the procedural changes visible in the most basic, everyday ways. As in any organizational setting the urgent has a way of displacing the strategic, and senior management has recognized the importance of demonstrating a clear commitment to talking “performance” and signaling “ownership.”

A second, related lesson revolves around the capture and use of performance information; that is, information on the results achieved and the costs of achieving those results. As this volume notes, accessing and deploying this information does not come cheaply. So how did New South Wales try to reduce the costs of performance information?

Initially, the Results and Services Plan budget process aims to manage the “costs” of performance by dealing with the information asymmetry that characterizes the relationship between central and line agencies. By framing the planning process as an information-sharing exercise, the RSP was as much about increasing an agency’s understanding of its own business as providing a tool for expanding the Treasury analyst’s knowledge.

Again, learning from the SRAA trial, a second focus is addressing bounded rationality. Acknowledging the potential for decision-makers to experience information overload, a key concern in designing the RSP was to make performance information resonate with the intended users. Performance stories—presented in the form of a one-page “results logic” diagram—use plain English narrative and visual displays to give users a snapshot of an agency’s current service delivery priorities and the extent to which these are aligned with government priorities. The performance story, however, is only half the story. To build commitment, ministers and chief executives must own a small number of key performance indicators that make sense in the context of the performance story.

The RSP process sought also to manage problems associated with outcomes measurement. The RSP is as much a methodology for thinking about how and why services are delivered, as it is a discrete strategic planning exercise. The results logic sets out key strategies (how services are delivered) and identifies priorities that can be used to drive performance management within an organization. Further, it helps to “frame” central-line discussion about performance expectations because “cause and effect” chains mark out an agency’s capacity to influence results (and hence its “accountability” for results achievement).

A third lesson relates to the importance of managing expectations about budget reform and the pace at which it can progress. The Treasury considers that a hallmark of the Results and Services Plan budget process has been an implementation strategy based on “trial and error”—there has been a preparedness to test different approaches to planning and priority-setting and an acceptance that some of these initiatives would need to be modified. The development of budget reform has also reinforced the significance of ongoing review of agencies’ implementation and the need to work closely with those officers who are actually preparing the plans. Finally, a central budget agency should not underestimate the value of candor in the reform strategy and acknowledge up front that change processes will be long and grinding.


Performance budgeting in New South Wales is a product of both a distinctly pragmatic approach to public sector reform and an intentional strategy to “watch and learn” from the experience of more radical reform in comparable jurisdictions. The Results and Services Plan budget process can be categorized as a “loose” performance budgeting system; that is, the process is designed primarily to give strategic context to discussion about relative funding priorities. While the change initiative is in the early stages of implementation, New South Wales Treasury remains committed to gradual, long-term budget reform and believes that significant progress has been made in linking planning and performance information with the way that resource allocation choices are considered.


The definition of a “program” in the New South Wales context remained at a high level and was applied inconsistently across service delivery agencies. In general, a program comprised a group of activities that contributed to a specific objective, for example, road maintenance tasks contribute to the development of road network infrastructure. Agency financial estimates are still presented on a “program” basis for the Budget year, but are reported at an aggregate level for the three forward years. For an example see 2005-06 Budget Paper 3 (NSW Treasury, 2005).

This type of studied neutrality has been applied by New South Wales in matters of public sector reform more generally. For example, in the late 1990s the Premier’s Department, which has responsibility for public sector performance improvement, noted similar concerns in its study of international experience with structural reforms separating policy and operational functions (NSW Premier’s Department, 1998).

Australia has a federal system of government. The national government in Canberra is variously described as the “Australian,” “Commonwealth,” or “federal” government.

In the 2005-06 budget 41 per cent of total budgeted revenues were sourced from the federal government (see NSW Treasury, 2005: Chapter 3 of 2005-06 Budget Paper 2).

The chief component of general purpose grants is revenue from the Goods and Services Tax (GST), a federal value added tax. The grants are distributed among the States on the basis of per capita relativities calculated by the independent Commonwealth Grants Commission. The key elements of the funding system, and donor states’ concerns about the way per capita relativities are calculated, are set out in Review of Commonwealth-State Funding (2002).

That is, all Ministers of the Crown, as distinct from a smaller grouping of senior ministers.

Established in 1824, New South Wales Treasury consists of two separate agencies: the Office of Financial Management (which advises on financial management policy and economic conditions) and the Office of State Revenue (which administers state taxation, collects revenue, and develops taxation policy and legislation). In this chapter, a reference to the Treasury means the Office of Financial Management.

Portfolio ministers and their agencies prepare two types of funding proposals—a maintenance of effort proposal (which seeks funding outside the forward estimates to maintain current service delivery levels) and an enhancement of effort proposal (which seeks funding outside the forward estimates to expand current services or introduce new services).

The financial (or budget) year runs from July 1 to June 30. The Appropriation Bill sets out estimates of payments for both recurrent services and capital works and services from the Consolidated Fund. An Appropriation Act must pass both Houses of Parliament—the Legislative Assembly (the lower house) and the Legislative Council (the upper house). However, where there is a disagreement between the Houses, the Constitution Act 1902 provides that the upper house has no power to prevent passage of appropriations for “ordinary annual services.” For more detailed discussion see Twomey (2004, pp. 530-82).

Cash allocations from the Consolidated Fund are appropriated to ministers and lapse at financial year’s end.

The “net cost of services” is the key administrative control Treasury uses to manage the impact of service delivery agency expenses on the budget aggregates in the budget and forward years.

The Act required that governments “aim to pursue” policy in accordance with the Act’s fiscal principles and targets (s10), but did not create enforceable obligations (s27). The Treasurer was, however, required to include in the annual budget papers a statement explaining the reasons for any departure from the principles (s10). In any event, the government has provided regular progress reports in Paper 2 of the Budget Papers (see, for example, NSW Treasury, 2005).

The Act defines “fiscal sustainability” as requiring “that the Government be able to manage financial shocks in future periods without having to introduce significant and economically or socially destabilising expenditure or revenue adjustments in those future periods.”

See Fiscal Responsibility Act 2005. For discussion about fiscal target achievement see Chapter 1 of 2005-06 Budget Paper 2 (NSW Treasury, 2005).

Value for money is defined as the requirement “that the limited resources available to Government be applied in such a way that community needs and expectations are satisfied to the greatest extent possible both now and in the future”; resource allocation is the “best mix of programs and services within the funding and expenditure constraints set by the State’s fiscal strategy”; and resource management is “selecting and managing the mix of labour, capital assets and other resources available in such a way that they deliver the greatest quantity and quality of programs and services possible” (NSW Treasury, 2000a, p. 6).

The trial agencies were the Attorney-General’s Department, the Office of State Revenue and the Departments of Agriculture and Urban Affairs and Planning. Later pilots would cover almost three-quarters of the General Government sector budget, including (in addition to the four trial agencies) the Departments of Health, Education and Training, Corrective Services and Land and Water Conservation, as well as the Roads and Traffic Authority.

Although the “request” to submit an SRAA was issued to agencies as part of the allocation letter (the formal advice from the Treasury about Budget and forward year estimates), the process was not mandated, for example in a Treasury Circular. Instead, the implementation strategy sought to create “incentives” for better planning by demonstrating the value of stronger linkages between strategic planning and the budget process. This approach continues with the Results and Services Plan budget process.

In each year the Treasury conducted a post-implementation review using a consistent data collection format, that is, participant interviews with agency chief executive officers, senior agency officers involved in the budget process, Treasury executive officers, and Treasury analysts (budget officers).

The Constitution Act 1902 provides for a fixed four-year term for the parliament of New South Wales, that is, a general election is held every four years.

The initiative was framed as “better services for New South Wales” and reflected the priorities of a third-term government aware of the need to consolidate and deliver on its election commitments.

Internal Treasury analysis indicated that, on average, agencies submitted enhancement proposals valued at about 3 percent of the total budget and that only about 10 percent of these proposals were finally approved. In effect, in a typical budget cycle, the efforts of agencies and government are directed at funding that is worth less than 1 percent of total budget expenditure (10 percent of 3 percent is roughly 0.3 percent).

These priority review processes drew on similar arrangements operating in other jurisdictions, for example, the Senior Ministers Review process run by the Commonwealth Department of Finance and Administration.

At the start of the 2004-05 budget process SRAA implementation remained in pilot stage with 11 service delivery agencies.

There were a number of sources inspiring the development of results logic. In the early 1990s the NSW Premier’s Department pioneered the use of “outcomes hierarchy” methodology to support better program evaluation (NSW Office of Public Management, 1990). More recent models include the use of “logic chains” by the Oregon Progress Board in the United States, the application of “intervention logic” in the preparation of Statements of Intent in New Zealand, and the promotion of “performance stories” by the Office of the Auditor General of Canada.

The RSP also emphasizes the use of everyday language to make technical information more accessible to non-specialist audiences, both ministerial and bureaucratic (the most obvious example being the replacement of “outcomes and outputs” with “results and services”). The use of everyday language was extended to the key questions that helped agencies prepare an RSP: “What do you do?” “Why do you do it?” “How do you know you are doing a good job?”

Agencies are encouraged to describe results (and intermediate results) as “end-points to be aimed for.” The result or intermediate result should then be used to select an indicator that can capture whether a situation is changing or improving.

An illustrative better-practice agency was the Department of Community Services (DoCS), which has responsibility for child protection services, parenting supports, and early intervention programs. During this period the DoCS Executive used the RSP approach to drive organizational reform and clarify service delivery priorities. This is illustrated in the integration of results logic methodology with the DoCS corporate plan (see DoCS, 2004).

A “cluster” refers to a grouping of ministers (and their agencies) around shared functions or closely related policy objectives. For example, the “human services” cluster might comprise the Ministers for Health, Community Services, Ageing, Disability and Home Care, Education and Training, Aboriginal Affairs, and Housing. While there were existing cabinet-level forums for some ministerial clusters, these arrangements were not aligned closely with the annual budget process.

In March 2005 the Treasury used “stages of development” criteria to assess agency RSPs submitted as part of the 2004-05 and 2005-06 budget processes. The findings indicate that there has been a steady improvement in the average quality of results logic development and performance information.
