29th May 2015, Volume 128 Number 1415

Fiona Bronwen Doolan-Noble, Mataroria Lyndon, Sybil Hau, Andrew Hill, Jonathan Gray, Robin Gauld

Background

Since the delivery of the World Health Organization (WHO) 2000 report on health system performance improvement,1 there has been increasing international interest in health systems and their assessment.2 The WHO defined a health system as, “…all the activities whose primary purpose is to promote, restore or maintain health”. For New Zealand, and many other developed countries, there is a growing acceptance that health systems, as a whole, have to change to meet the changing healthcare needs of their populations, who are ageing and increasingly likely to be burdened with chronic conditions.3 To meet these changing demands there is an increasing focus on integration of services at all levels and across all sectors, including social services.4 Against this backdrop, policy makers and health care managers are keen to determine not only how well their system is responding to changing health care needs in their area, but also how well their health system compares with others.

At the international level, organisations such as the WHO, the Organisation for Economic Cooperation and Development, and the Commonwealth Fund, have taken a lead in developing methods for comparing health systems.1,5-6 The work of such agencies is useful for national policy-makers in particular, for highlighting performance of their health systems at a relatively abstract level and for cross-national learning in key areas such as quality of care, expenditure and workforce. Beyond this level, the usefulness of such data is limited. Indeed, within a country it is difficult for national or regional health system stakeholders to obtain information meaningful to their organisation from such general level data. Consequently, performance benchmarking at a national level has commenced, providing insights into health system performance within individual countries.7 Various approaches are emerging, including the development of a national health system performance scorecard for New Zealand.8 However, scorecards give an overall snapshot of the health system, and the New Zealand scorecard requires further development to enhance its utility at the local District Health Board (DHB) level.

Answering the question of how well an individual DHB performs is far from straightforward. This is partly due to the complexity and scope of DHB activity, but also a historical lack of investment in composite measures for performance measurement, along with the range of central agencies monitoring and reporting on different aspects of performance. Moreover, most DHBs gather and report a range of data from the various components of the local health system, from primary care through to individual hospital and health services. While the measurement challenge is one that an impending national Integrated Performance and Incentive Framework (IPIF)9 seeks to address, some DHBs have sought to develop their own measures, including Counties Manukau Health (CMH). This should not ultimately result in a duplication of effort, as the development of the IPIF and CMH’s approach, as described below, has been an interactive process; each has informed the other, aided by involvement of some CMH staff in the IPIF development.

This brief article describes the process undertaken by CMH to develop a set of system-level measures. It aims to raise interest across the DHB sector, both locally and internationally, in performance measurement, using routinely collected data. In doing so, we seek to fill a gap in the field in New Zealand.10

CMH is one of 20 DHBs in New Zealand whose legislated role involves the improvement, promotion and protection of the health and wellbeing of the people in the communities they serve. CMH funds and provides health and disability services for some 500,000 people living in the southern third of Auckland City and in neighbouring Franklin and Papakura districts.

Similar to other DHBs in New Zealand, and health systems in many developed countries, CMH currently has to contend with multiple health care challenges, including an ageing population and increasing chronic illness, resulting in a pressured health care budget. CMH, however, faces the additional challenges of high population growth (2-3% annually), the highest birth rate in the country and a very young population, with 24% aged 14 years or under.11,12 Māori and Pacific people make up a significant proportion of the population compared with many other DHBs, at 17% and 23% respectively, and 34% of the population live in areas of deprivation.11 CMH, therefore, is doubly disadvantaged, in that it has to meet the needs of an older population burdened by chronic illness, as well as the health care needs of a younger population.

In response, CMH has committed to an integrated health system and services development agenda, and identified a mission “to be the best healthcare system in Australasia by December 2015”. Their intent is to embed a broader range of services within the community via four primary care ‘locality clinical partnerships’. Thus, the DHB’s goal is to build a ‘whole of system’ approach to service delivery that meets the needs of all members of the population, irrespective of health need or disability. Yet to date, how well their health system as a whole meets the diverse pressures it faces, and how well the whole of system approach to service delivery is performing, is largely unknown. Consequently, CMH has developed a series of System Level Measures (SLMs) to assess the effectiveness and overall performance of their health system. The SLMs aim to assess performance in relation to health care quality, the integration of care and health care outcomes. They are not intended to induce competition, but rather to help the DHB track its performance on a journey of continual improvement in intentionally selected and measured aspects of the aforementioned dimensions.

System Level Measures: what are they and why use them?

Two organisations have driven the increasing use of quality indicators in health care: the Institute of Medicine (IOM) and the Institute for Healthcare Improvement (IHI).13 These organisations have both advocated for and advanced the area of performance reporting systems in healthcare.13 The IHI was responsible for developing a system of metrics known as Whole System Measures (WSMs),14 on which the CMH SLMs are based. These measures aim to be indicators that are easy to capture and are designed to provide organisational leaders with data that:

  • Show performance of the health system over time
  • Allow the organisation to compare its performance relative to strategic improvement plans
  • Allow the organisation to compare itself to similar organisations
  • Contribute to ongoing strategic quality improvement planning.

The WSMs align with the six dimensions of quality identified by the IOM. These are that care should be safe, effective, patient-centred, timely, efficient and equitable, as well as reflecting care in different sites across a continuum of care.15 In addition, WSMs link to the Baldrige Health Performance Excellence Framework, which is recognised as a robust method for evaluation of health care systems.16 WSMs are macro-level measures or ‘big dots’, such as Hospital Standardised Mortality Rate (HSMR) and Acute Readmissions to Hospital, and are designed to provide a comprehensive overview of a health system’s overall quality and performance. These ‘big dots’ are underpinned by specific measures (‘little dots’) captured at different levels of the system, and known to contribute to the performance of the WSMs. Hence, ‘big dots’ can be decomposed to ‘little dots’ to determine what is influencing performance. The IHI’s WSMs are not a static collection of metrics but are designed to be modified to reflect an organisation’s vision and strategies, as well as its current priority areas.
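To make the ‘big dot’/‘little dot’ relationship concrete, the following minimal sketch computes an HSMR-style ratio for a whole system and then decomposes it by clinical service. All figures and service names are invented for illustration; they are not CMH data.

```python
# Illustrative sketch only: hypothetical data showing how a 'big dot'
# (an HSMR-style observed/expected ratio) decomposes into 'little dots'
# by service. All figures are invented for demonstration.

services = {
    # service: (observed_deaths, expected_deaths)
    "general_medicine": (120, 110.0),
    "general_surgery": (45, 50.0),
    "cardiology": (30, 25.0),
}

def ratio(observed, expected):
    """Standardised mortality ratio, conventionally scaled so 100 = as expected."""
    return 100.0 * observed / expected

# The 'big dot': one whole-of-system number.
total_obs = sum(o for o, _ in services.values())
total_exp = sum(e for _, e in services.values())
big_dot = ratio(total_obs, total_exp)

# The 'little dots': the same ratio per service, showing where the
# big dot's movement is coming from.
little_dots = {name: ratio(o, e) for name, (o, e) in services.items()}

print(f"HSMR (big dot): {big_dot:.1f}")
for name, value in sorted(little_dots.items(), key=lambda kv: -kv[1]):
    print(f"  {name}: {value:.1f}")
```

Sorting the per-service ratios highest-first is one simple way of seeing which ‘little dots’ are pulling the system-level figure up or down.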

The journey so far for CM Health

Using SLMs for internal improvement monitoring and external comparison purposes is not unique to CMH. Numerous healthcare organisations internationally, large and small, collect data to measure their performance using WSMs. Examples of organisations applying SLMs include Jönköping County in Sweden, Public Health Wales, and Cincinnati Children’s Hospital in the United States. The appointment to Ko Awatea, CMH’s education and health system innovation and improvement centre, of a director with previous experience of using WSMs created the opportunity for CMH to utilise this improvement and performance measurement approach.

The journey has three distinct phases. Phase one involved a review of the literature and national and international system level quality frameworks including:

A. IHI Whole System Measures

B. New Zealand Health Quality and Safety Indicators

C. Jönköping County System Measures

D. Public Health Wales System Level Improvement Measures

E. Cincinnati Children’s Hospital Medical Centre System Level Measures

F. The Commonwealth Fund Score Card.

This process assisted in the selection of potential SLMs. The eventual SLMs were chosen based on their ability to inform the monitoring of progress towards CMH’s strategic ‘Triple Aim’17 of improved health and equity for all populations, improved quality, safety and experience of care, and best value for public system resources, as well as their ability to link clearly and logically to CMH’s strategic objectives. Furthermore, the utility of the SLMs to support monitoring of performance over time, comparisons with other organisations and reinforcement of improvement planning were critical considerations, as was the feasibility of utilising existing data collections within CMH. Basing the selection of the SLMs on these criteria has ensured that all major areas of the health system are covered. There was also an endeavour to ensure that the chosen measures complemented one another; in other words, each is not an isolated metric, but relates to multiple other measures. In doing this, CMH can monitor how change in one SLM, for example PHO enrolment rates, contributes to increases or decreases in acute hospital readmissions or the rate of childhood immunisations. To date, a suite of 16 SLMs has been selected and their utility and validity assessed. These measures and their interrelationships are presented in Figure 1 (below), and examples of how they relate to the Triple Aim are outlined in Table 1.
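As a rough illustration of how related SLMs might be monitored together, the sketch below computes the correlation between two hypothetical monthly series, such as PHO enrolment and acute readmissions. All figures are invented, and a correlation is only a screen for co-movement, not evidence that one measure drives another.

```python
# Illustrative sketch only: invented monthly values for two hypothetical
# SLM series. Correlation flags co-movement worth investigating; it does
# not establish causation.
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly PHO enrolment rate (%) and acute readmission rate (%).
pho_enrolment = [88.0, 88.5, 89.1, 89.4, 90.2, 90.8]
acute_readmissions = [8.4, 8.3, 8.1, 8.2, 7.9, 7.7]

r = pearson(pho_enrolment, acute_readmissions)
print(f"Pearson r between the two series: {r:.2f}")
```

A strongly negative r in a sketch like this would prompt a drill-down into the contributory ‘little dots’ rather than any immediate conclusion.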

Figure 1:  CMH’s system level measures across the continuum of care (Adapted from the IHI Whole System Measures)14

  

 

Table 1: Examples of SLMs and their relationship to the Triple Aim

| Triple Aim | Population Health | Patient Experience of Care | Cost and Productivity |
|---|---|---|---|
| SLMs | Un-enrolled Health Service Utilisation* | Rate of Adverse Events | Healthcare Cost per Capita |
| | Ambulatory Sensitive Hospitalisations | Acute Readmission to Hospital within 28 Days | Workforce Retention |

*Un-enrolled refers to those in the population who use in-patient services but are not enrolled in a Primary Health Organisation.

Some of the SLMs chosen by CMH are more overtly related to its own controllable actions than others. CMH does, however, have an influence on all the chosen measures, including life expectancy and ambulatory sensitive hospitalisations. While the ability to determine exactly the influence of CMH on some measures is more challenging than others, the ability to drill down from the ‘big dots’ to their contributory factors, as discussed next, does allow CMH to uncover potential reasons for changes in the measures over time. In addition, the drill-down process highlights potential caveats on the data, as well as areas for further research.

Phase two is (at the time of writing in December 2014) underway and ongoing. This phase involves the careful consideration and identification of robust contributory measures (‘little dots’) for each SLM. This process of ‘drilling down’ on each SLM enables identification of measures that influence SLM performance. The on-going nature of this phase is necessary to allow enough time to develop certainty around the appropriateness of the contributory measures chosen. In addition, a dashboard (see Figure 2 for a descriptor of a dashboard and other SLM terminology) has been developed, providing senior managers at CMH with a real-time snapshot of the system’s performance across the selected measures.

Figure 2: Terminology related to SLMs

System measurement framework: a measurement framework based on the “Triple Aim” that reflects the overall quality and performance of a healthcare system.

System Level Measures: these are a set of system measures that aim to evaluate performance on quality and value across a whole system, thereby providing input into strategic quality improvement planning.

‘Big dots’: these are the system level measures and these equate to core processes or functions of the organisations in the system. They are not programme, unit, or disease specific.

‘Little dots’: these are process and outcome indicators at a programme or unit level.

Scorecard: reports on a defined number of measures providing managers with information on the performance of the organisation.

Dashboard: unlike a scorecard which is a snapshot in time, a dashboard uses real time data to assist decision making.

Benchmark: the best result previously achieved by an organisation or department.  A benchmark can be used in conjunction with other comparative data to interpret and evaluate performance and set goals.


In May 2014 phase three commenced. This phase is being undertaken collaboratively with researchers commissioned by Ko Awatea. This third phase has multiple objectives, including the identification of potential health care systems to compare CMH to and establishing appropriate benchmarks. This phase of the project contains various challenges which are discussed next.

Challenges, methodological and operational

Although cross-country comparisons of health system performance have the potential to enhance shared learning, there are some well recognised difficulties which make such comparisons intrinsically difficult.7 These include population variations, definitional issues and coding differences, to name a few.18 However, strategies are available to address these methodological challenges, such as age and sex standardisation of populations, and the use of indicators with internationally standardised definitions for coding.19 There are also innate tensions in deciding which SLMs to collect: ones that allow for cross-country comparisons, ones that are strongly aligned with organisational priorities, or a mixture? Essentially, this is a question of breadth or depth of performance comparison.20 Another tension arises between the need for consistency of definitions, numerators and denominators, while accepting a level of flexibility to accommodate comparison with different countries.20 Furthermore, there is also a potential for unintended consequences to emerge when using SLMs to guide performance or even quality improvement. For example, there is a possibility that the focus on specific SLMs diverts attention, and possibly finance, from other parts of the system, potentially resulting in misprioritisation.21 However, CMH’s SLMs are philosophically based on an improvement, not a performance judgement, framework. As the name suggests, they focus on the system. A performance framework, such as the IPIF, tends to use financial and other incentives, such as increased DHB autonomy, to enhance performance in discrete areas. Such approaches can, therefore, result in unfairly focusing attention on a particular service, process or health outcome.
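One of the standardisation strategies mentioned above, direct age standardisation, can be sketched in a few lines. The age bands, event rates and standard-population weights below are all invented for illustration; real comparisons would use an agreed standard population and far finer age bands.

```python
# Illustrative sketch only: direct age standardisation with invented
# figures, showing one common way the 'population variation' problem in
# cross-system comparison is handled.

# Hypothetical age-specific event rates (per 1,000) for two systems.
rates_a = {"0-14": 2.0, "15-64": 5.0, "65+": 40.0}
rates_b = {"0-14": 2.5, "15-64": 4.5, "65+": 35.0}

# A shared standard population (hypothetical weights summing to 1).
standard_weights = {"0-14": 0.20, "15-64": 0.65, "65+": 0.15}

def standardised_rate(rates, weights):
    """Weight each age band's rate by the standard population's share."""
    return sum(rates[band] * weights[band] for band in weights)

std_a = standardised_rate(rates_a, standard_weights)
std_b = standardised_rate(rates_b, standard_weights)
print(f"System A: {std_a:.2f} per 1,000 (age-standardised)")
print(f"System B: {std_b:.2f} per 1,000 (age-standardised)")
```

Because both systems are weighted by the same standard population, the resulting rates can be compared without being distorted by differing age structures.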

Closer to home, the challenges relate more to operational issues. These include whether the organisation has the technical capability to capture and retrieve the data related to the SLMs of interest. These data frequently come from a variety of repositories, so data linkage can be challenging.22 In addition, analysis of these data requires an understanding of their origins and any related limitations, for example their reliability and the extent to which the data have been validated, before they can be utilised effectively.20,23 The need for a person to lead a team of data analysts and provide strong data collection oversight is, therefore, an essential prerequisite in assuring data quality over time.19 In addition to operational issues, there were also the usual challenges for CMH in terms of deciding which measures to include in a multidimensional framework. These were handled through an iterative and consultative developmental process within CMH’s SLM development group, meaning that there was considerable scope for wide-ranging discussion around measures that were and were not included.

While the literature contains information regarding some of the challenges and unintended consequences, less is written about the potential supplementary benefits of undertaking this type of initiative. In their article on the ancillary benefits of clinical performance measurement, Powell et al drew attention to benefits that may equally apply to the implementation of a system of SLMs.21 These include an increase in the pride with which staff view the organisation, increased motivation and increased confidence that the care provided is evidence based. The authors also mention patient benefits, such as increased patient satisfaction, which again may be an unintended benefit of a health care system monitoring SLMs.

Next steps

The immediate next steps involve collaborating with countries considered appropriate for comparison to assess the feasibility of data comparisons. Once it has been established that data are comparable between health systems, a pilot comparison of one or two SLMs will be conducted and assessed, prior to comparing a larger group of SLMs. It is anticipated that these comparisons will be on-going, assisting learning and quality improvement in all sites.

Conclusion

A core component of any high-performing health system is the employment of a comprehensive measurement system to advance quality improvement. Focusing on SLMs will provide CMH with robust information on the quality and safety of their health services, inform performance improvement strategies within the system and support progress towards the ‘Triple Aim’. As a result, this health system will be one where quality is the result of conscious and responsive design with indicators intended to reflect the core strategic focus of the organisation.

By aiming to compare with other similar health systems internationally, CMH will be provided with opportunities for mutual learning and networking with system performance experts, which is crucial for stimulating improvement. Moreover, comparing with a similar but acknowledged high-performing sub-national health system, such as Jönköping in Sweden, could allow CMH to develop aspirational benchmarking targets in relation to specific SLMs that are relatively easy to compare. Examples include childhood immunisation status and hospital standardised mortality rates. The leadership at CMH is committed to openly sharing experiences and lessons learned along the way, as well as providing leadership to the wider health sector in New Zealand on the development of SLMs. This work also enables the organisation to contribute to the literature in the field of quality improvement, measurement and evaluation of health systems. It is now time for other DHBs, alliances and other health care providers in New Zealand to similarly focus their efforts on system improvement, aided by system-wide measurement.

Summary

Counties Manukau DHB has adopted a unique approach of using whole-of-system measures to understand how its health system is functioning. The measures adopted inform quality improvement activity and ensure that quality improvement initiatives are targeted at the right part of the system. In addition, adoption of these measures allows the DHB to compare itself with similar organisations nationally and internationally.

Author Information

Mrs Fiona Doolan-Noble, Department of Preventive and Social Medicine, University of Otago; Dr Mataroria Lyndon, Ko Awatea, Counties Manukau and South Auckland Clinical Campus, University of Auckland; Mrs Sybil Hau, Ko Awatea, Counties Manukau Health; Professor Andrew Hill, Ko Awatea, Counties Manukau and South Auckland
Clinical Campus, University of Auckland; Professor Jonathan Gray, Ko Awatea Counties Manukau Health; Professor Robin Gauld, Preventive and Social Medicine, University of Otago.

Correspondence

Mrs Fiona Doolan-Noble, Department of Preventive and Social Medicine, University of Otago

Correspondence Email

fiona.doolan-noble@otago.ac.nz

References

 

1.        World Health Organisation. The World Health Report 2000: improving health system performance. Geneva: World Health Organisation, 2001.

2.        Papanicolas I, Kringos D, Klazinga N, Smith P. Health system performance comparison: New directions in research and policy. Health Policy. 2013;112:1-3.

3.        Mays N, Marney J, King E. Fiscal challenges and changing pattern of need for health and long term care in New Zealand. Policy Quarterly. 2013;9(4):35-46.

4.        Cumming J. Integrated care in New Zealand. Int J of Integr Care. 2011;11(18th November).

5.        Organisation for Economic Cooperation and Development. Towards high-performing health systems. Paris: Organisation for Economic Cooperation and Development, 2004.

6.        Frogner B, Anderson G, Hussey P. Multinational comparisons of health systems data, 2005. Commonwealth Fund, 2006.

7.        Veillard J, McKeag A, Tipper B, et al. Methods to stimulate national and sub-national benchmarking through international health system performance comparisons: A Canadian approach. Health Policy. 2013;112:141-7.

8.        Gauld R, Al-wahaibi S, Chisholm J, et al. Scorecards for health system performance assessment: the New Zealand example. Health Policy. 2011;103:200-8.

9.        Expert Advisory Group. Integrated Performance and Incentive Framework. Wellington: Ministry of Health, 2014.

10.     Chaudhry M, Gauld R, Horsburgh S. Hospital quality-of-care performance measurement and reporting: what New Zealand can learn from the United States and the United Kingdom. NZMJ. 2012;125(1366).

11.     Counties Manukau District Health Board. About Counties Manukau District Health Board. Available from: http://www.countiesmanukau.health.nz/about_cmdhb/overview/printpopulationprofile.pdf

12.     Ministry of Health. Report on Maternity, 2010. Wellington: Ministry of Health, 2012.

13.     BC Patient Safety and Quality Council. Measurement strategies for improving the quality of care: A review of best practice. Vancouver (BC): BC Patient Safety and Quality Council, 2010.

14.     Martin L, Nelson E, Lloyd R, Nolan T. Whole of System Measures: IHI Innovation Series White Paper. Cambridge, Massachusetts: Institute of Healthcare Improvement, 2007.

15.     Institute of Medicine. Crossing the quality chasm: A new health system for the 21st century. Washington (DC): 2001.

16.     Foster T, Johnson J, Nelson E, Batalden P. Using a Malcolm Baldrige framework to understand high-performing clinical microsystems. Qual Saf Health Care. 2007;16(5):334-41.

17.     Berwick D, Nolan T, Whittington J. The Triple Aim: Care, health and cost. Health Aff. 2008;27(3):759-69.

18.     Veillard JHM. Performance management in health systems and services: Studies on its development and use at international, national/jurisdictional, and hospital levels. Amsterdam: University of Amsterdam; 2012.

19.     Veillard J, Kadandale S, Klazinga N. International health system comparisons: from measurement challenge to management tool. In: Smith P, Mossialos E, Papanicolas I, Leatherman S, editors. Performance Measurement for Health System Improvement: Experiences, Challenges and Prospects. Cambridge: Cambridge University Press; 2010.

20.     Forde I, Morgan D, Klazinga N. Resolving the challenges in the international comparison of health systems: The must do’s and the trade-offs. Health Policy. 2013;112(1-2):4-8.

21.     Powell A, White K, Partin M, et al. More than a score: a qualitative study of ancillary benefits of performance measurement. BMJ Qual Saf online. 2014;23:651-8.

22.     Atalag K, Gu Y, Pollock M. A stocktake of New Zealand’s healthcare datasets. Health Informatics New Zealand Conference; Rotorua, 2013.

23.     Groene O, Kristensen S, Arah O, et al. Feasibility of using administrative data to compare hospital performance in the EU. Int J Qual Health Care. 2014;26(S1):108-15.