15 August 2023

Address to the Data for Policy Summit, Canberra

Data and evaluation: A match made in policy heaven

I acknowledge the Ngunnawal people as traditional custodians of the ACT and recognise any other people or families with connection to the lands and region.

I acknowledge and respect their continuing culture and the contribution they make to the life of this city and this region.

I commit myself, as a member of the Albanese government, to the implementation in full of the Uluru Statement from the Heart, including a constitutionally enshrined Voice to Parliament.

Thank you to the Life Course Centre for hosting today’s Summit, and thank you for focusing your efforts on the causes of disadvantage in Australia.

‘Name a better duo’ is a popular social media caption.

I could name Caitlin Foord and Hayley Raso but the entire Matildas squad is star studded.

I could name Canberra and Spring as a world‑class combination, but the allergy sufferers may beg to differ.

So, today I’m going to break the internet and name data and evaluation as the most dynamic duo.

They’re a match made in policy heaven.

The area we’re talking about is a big deal. According to the Australian Institute of Health and Welfare, Commonwealth, state and territory governments spend over $60 billion a year on services and programs to support individuals and communities (AIHW 2021).

This encompasses a range of services such as family support, youth programs, child care services, services for older people and services for people with disability.

The Life Course Data Initiative – announced in the recent Budget and welcomed by the Life Course Centre – will support investment in places of disadvantage and enable more targeted support (Life Course Centre 2023).

Last September, Minister for Social Services Amanda Rishworth and I also announced funding for the Australian Bureau of Statistics (ABS) to increase the frequency of data collection on disadvantage.

On that note, I’d like to thank Dr David Gruen for his earlier remarks on improving integrated datasets and providing greater access for researchers and policymakers.

And I would like to thank my parliamentary colleague Minister Rishworth and the Department of Social Services for their update this morning.

Data hungry

The story of using data to inform policy has a long lineage.

In the mid‑1930s, the Australian Government appointed a Commonwealth Advisory Council of Nutrition to undertake the first national survey of nutrition.

In the midst of the Great Depression, the Council put together a team of health and agricultural experts including scientists, researchers, doctors and dentists.

State committees also played a part.

The Council conducted two major studies during its short existence.

The first was initiated via a daily questionnaire published in capital city newspapers asking people about the food they purchased and the food they ate that day. Some participants were invited to continue in the study for up to a year.

The Council ended up analysing the food records of around 1,800 households as part of the ‘dietaries’ study (Senate Standing Committee on Social Welfare 1979).

It found Australians of the 1930s were ‘generally well fed’, but questions have since been raised about whether the sample was truly representative of the population (Senate Standing Committee on Social Welfare 1979).

For the second study, the Council sent a doctor out to examine almost 6,000 children as part of ‘a survey of the nutritional state of children in inland areas’ (ABS 1940).

The results of this survey were more concerning. It found ‘a considerable mass’ of malnutrition, ranging from 13 per cent in rural Victoria to 24 per cent in New South Wales (O’Connell n.d.).

The findings from these studies led the Council to recommend the formation of a child health division in the Department of Health.

It also called for medical supervision in kindergarten and primary schools nationally as well as reduced milk prices.

School milk programs have their genesis in this data collection process.

Today, the linkage between data and policy continues. The National Health and Medical Research Council is currently working with experts to update the Australian Dietary Guidelines. This is informed by surveys showing that only 1 in 20 adults and 1 in 17 children met both the daily fruit and vegetable recommendations (ABS 2018).

Data breakthroughs

If data and evaluation are a match made in heaven, everyone here would agree that the stairway to get there can take some climbing.

It can be time consuming but worth the effort. Data and evaluation have the power to bring about positive change and I want to share a few examples.

Schools learn to improve

The first relates to the MySchool and National Assessment Program - Literacy and Numeracy (NAPLAN) data.

In 1999, the Ministerial Council on Education, Employment, Training and Youth Affairs released the Adelaide Declaration on National Goals for Schooling in the 21st Century – an agreement to progress towards a higher national standard for schooling, including reporting nationally on school achievement.

It was a long road, but more than a decade later, in January 2010, the MySchool results went live for the first time. This allowed parents to see how their child’s school or prospective school stacked up against other similar schools. It also allowed school administrators to assess their relative strengths and weaknesses.

Over the years, MySchool reporting has encouraged schools doing well to continue their good work. Schools looking to improve can more readily identify exemplars to learn from.

Some of those schools shared their inspirations and lessons learnt (ACARA 2017). One principal said their school’s success story was all about data – the school was ‘unaware of severe downward trends in literacy and numeracy that had been occurring over many years’. Another principal said the school identified gaps and restructured intervention programs. Others invested in staff programs and professional learning.

Closing prisons

Another successful example of data‑driven policy comes from the United States and involves addressing concerns about mass incarceration and the need for criminal justice reform (BJA 2012).

For three years, from 2006 to 2008, the United States incarcerated more than 1 per cent of its adults (Leigh 2020). The human and economic cost of this was enormous.

Launched in 2010, the Justice Reinvestment Initiative used criminal justice data to identify areas of over‑incarceration, assess offender risk, and estimate future prison populations.

By understanding these trends, US policymakers could tailor interventions. They could put in place diversion programs for low‑risk offenders. And they could invest in community‑based rehabilitation and treatment services.

The adoption of the Initiative led to significant reductions in prison populations in various states, saving millions of dollars in correctional costs while maintaining public safety.

For example, Utah used funds to identify alternative justice pathways that had proved successful in the past. Increased use of these pathways reduced incarceration and directed people to options that decreased recidivism (BJA 2012).

Oregon took a different direction. It undertook a system analysis of the needs of people in the criminal justice system, eliminating expensive duplication of services while also reducing the reliance on prison as a means of behavioural health treatment (BJA 2012).

During the 2010s, the United States experienced a drop in crime and a fall in the incarceration rate. In Republican‑run and Democratic‑run states, prisons were closed. Even with the increase in crime that occurred during the COVID pandemic, rates of violent crime and rates of imprisonment are substantially lower than they were 15 years ago. Data and evaluation have played a critical role.

Kiwi data

Integrated Data Infrastructure – the New Zealand Government’s data tool – is also a data success story, with more than 730 projects using its microdata in the past decade (Song 2022).

In research on social investment in New Zealand, the Life Course Centre’s very own Janeen Baxter and Sara Kalucza said:

While the Integrated Data Infrastructure was established and is run independently of the Social Investment Agency, it was fundamentally a part of the social investment approach to integration of data into policy, and achieving the key goal of identifying and tracking population cohorts over time and the opportunity to evaluate interventions or system‑wide policy changes (Kalucza and Baxter 2021).

Integrated Data Infrastructure works by linking, or marrying up, administrative data from multiple government agencies. The data cover health, justice, the labour market, social development, housing and education – in some cases going back as far as 1840.

This enables researchers and policymakers to study individuals and populations over time. It also facilitates longitudinal analysis that provides insights into social trends, life events, and policy impacts.

All identifying variables are removed prior to release to protect individuals’ privacy. These data also link individuals, via taxation data, to the Longitudinal Business Database, a microdata resource about New Zealand businesses.
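
For the data specialists in the room, here is a minimal sketch of what that linkage and de‑identification step can look like in code. The dataset names, fields and hashing scheme below are hypothetical illustrations, not the Integrated Data Infrastructure’s actual design:

```python
import hashlib

import pandas as pd

# Two hypothetical agency extracts sharing a common person identifier.
# Names and fields are illustrative only, not the actual IDI schema.
health = pd.DataFrame({"person_id": [1, 2, 3],
                       "has_chronic_condition": [True, False, True]})
income = pd.DataFrame({"person_id": [1, 2, 3],
                       "annual_income": [52_000, 71_000, 38_000]})

# Step 1: link administrative records on the shared identifier.
linked = health.merge(income, on="person_id", how="inner")

# Step 2: de-identify before release. A salted one-way hash replaces the
# raw identifier, so records can still be joined within the research
# environment but cannot easily be traced back to a named individual.
SALT = "project-specific-secret"  # assumed; real schemes are more elaborate
linked["research_key"] = linked["person_id"].map(
    lambda pid: hashlib.sha256(f"{SALT}:{pid}".encode()).hexdigest()[:16])
released = linked.drop(columns=["person_id"])

print(released)
```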

The comprehensive nature of the Integrated Data Infrastructure presents many opportunities for policymakers to investigate policy impacts across most portfolios in the New Zealand government: for example, where people are located, how many there are, and how great their need is.

In one example, a researcher from the NZ Treasury used a combination of data to examine how health conditions such as diabetes, stroke and cancer can affect people’s income and employment (Dixon 2015).

In another example, the Ministry for Children (Oranga Tamariki) used the Integrated Data Infrastructure to better understand which children in which locations were at risk of ending up in the foster care system (McCann 2019). Efforts were targeted at the locations identified through the dataset.

Opportunities for improved data

Like all relationships, the one between data and evaluation can always improve, with the two working better in tandem. That’s the case in Australia, where work is in progress.

Quality of work

The government’s employment services program, known as Workforce Australia, provides an example.

The program’s main objective is to help job seekers find and maintain secure work. Therefore, it’s reasonable to measure the success of the program on whether it enhances the chance of a job seeker finding a job and remaining employed.

We know from job seeker surveys and employment administrative data whether program participants are employed, but these data do not capture everyone, nor do they always capture the nature of their employment.  

The Department of Employment and Workplace Relations evaluates the impact of the program using exit measures. For example, the Department uses exit from employment services and from income support as a proxy for employment. And it uses the duration of the exit as a proxy for employment sustainability. 
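
To make the idea concrete, here is a stylised sketch of how such proxy measures might be computed from administrative records. The data are synthetic, and the field names and the 26‑week threshold are assumptions for illustration rather than the Department’s actual definitions:

```python
from datetime import date

import pandas as pd

# Synthetic income-support records for three job seekers. Field names and
# the 26-week threshold are assumptions, not the Department's definitions.
spells = pd.DataFrame({
    "job_seeker": ["A", "B", "C"],
    # None means the person is still on income support.
    "exit_date": [date(2022, 3, 1), date(2022, 6, 15), None],
})
reference_date = date(2023, 8, 15)

# Proxy 1: exiting income support stands in for "found employment".
spells["employed_proxy"] = spells["exit_date"].notna()

# Proxy 2: time spent off income support stands in for "sustained
# employment" - here, an exit lasting 26 or more weeks counts as sustained.
spells["weeks_off_support"] = spells["exit_date"].map(
    lambda d: (reference_date - d).days // 7 if d else 0)
spells["sustained_proxy"] = spells["weeks_off_support"] >= 26

print(spells)
```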

The challenge for policymakers is that these exit measures tell us nothing about job quality.

Studies tell us that the level of earnings isn’t the only factor that determines job quality, but it’s undoubtedly an important one.

A study on the quality of employment among young Britons found no universally satisfactory definition of job quality (Orlando 2021). However, it found that common traits include a good and fair income, job security and stability, opportunity to progress, a good work‑life balance, and a voice for employees.

A US study – titled Not Just A Job: New evidence on the quality of work – found that 60 per cent of survey respondents self‑assessed as having a mediocre or bad job (Rothwell and Crabtree 2019). This study’s list of what constitutes a good job looks very similar to the UK one: pay, job security, opportunity for advancement, benefits, stability and dignity.

The OECD also uses earnings quality as one of the three ways to measure and assess job quality.

A decent life needs a decent income, and a decent level of income comes from decent earnings. Therefore, where possible, earnings should be included as an outcome measure in evaluating the employment services program. 

The inclusion of earnings in evaluating employment services programs can shift the mindset about how these programs are expected to work.

There’s an opportunity to build on the ‘work‑first’ approach that most current programs take, with the aim of moving job seekers off income support.

Using earnings as an outcome allows us to devote more attention and resources to the development of job seekers’ human capital, which is the most important factor determining earnings.

For these reasons, earnings data should be added to the data assets held by the department. This would enhance its capability to accurately measure job seekers’ labour market outcomes, and consequently to conduct more meaningful evaluations of the program’s impacts.

The Department of Employment and Workplace Relations is working with the Australian Taxation Office and Treasury to explore options to integrate the tax office’s earnings data into employment services data.

Linking tax data to the unemployment system would only provide information on pay – it wouldn’t say much about security, stability, progression, balance, voice or dignity.

However, including tax data could be a significant first step towards understanding the level of employment a person has achieved.

In the longer run, better data will improve the quality of the Department’s evaluation work.

The Australian Centre for Evaluation

More broadly, the establishment of the Australian Centre for Evaluation provides an opportunity for evaluation to improve public programs and save taxpayer money.

The Centre will promote the use of high‑quality evaluation across the Australian Public Service. It will partner with government agencies to initiate a small number of evaluations each year. And it will work to improve evaluation capabilities, practices and culture across government.

In particular, the Centre will implement multiple randomised controlled trials, focusing not just on large trials but also on quick and economical experiments.

One example of this occurred in 2014, when the Obama Administration pushed for better evidence‑based policy via low‑cost randomised controlled trials.

In some cases, trials can be low cost because the administrative data has already been collected. In other cases, the intervention can be extremely low cost.

For example, a child substance abuse program in Illinois cost about $100,000 over ten years, but saved around 60 times that much through shorter and fewer foster placements (Coalition for Evidence‑Based Policy 2012).

In another low‑cost example, a New York incentive program paid teachers in low‑performing schools a bonus if results improved. The school board already collected the outcome measures, so the bonus offered to teachers was the only expense. In this case, the admin data clearly showed the lack of effect of the bonus payments (Coalition for Evidence‑Based Policy 2014).

We have seen low‑cost randomised controlled trials in Australia. In 2016, Rebecca Goldstein and Professor Michael J. Hiscox led a trial to evaluate changes to the School Enrolment and Attendance Measure (SEAM) program in the Northern Territory (Goldstein and Hiscox 2018).

The program was a type of ‘welfare sanction’. In other words, participants could lose income support payments if they failed to comply with conditions – in this case, failing to address a poor school attendance record.

The outlay for this randomised controlled trial was minimal. It required no materials, no travel, and only a short time for the researchers to pair like students for the treatment and control groups.
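
For those interested in the mechanics, here is a minimal sketch of that matched‑pair design using made‑up student records. Matching on prior attendance alone is a simplification of the actual trial protocol:

```python
import random

# Made-up student records: each has an identifier and a prior attendance
# rate, the single matching variable used in this simplified sketch.
students = [{"id": f"s{i}", "prior_attendance": rate}
            for i, rate in enumerate([0.62, 0.64, 0.81, 0.79, 0.55, 0.57], 1)]

random.seed(42)  # reproducible assignment for the example

# Sort on the matching variable so adjacent students are "alike", then
# randomly allocate one member of each pair to treatment, one to control.
students.sort(key=lambda s: s["prior_attendance"])
treatment, control = [], []
for i in range(0, len(students), 2):  # assumes an even number of students
    pair = students[i:i + 2]
    random.shuffle(pair)
    treatment.append(pair[0])
    control.append(pair[1])

print("treatment:", [s["id"] for s in treatment])
print("control:  ", [s["id"] for s in control])
```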

The researchers looked at attendance records for both groups and found ‘no significant differences’ following any of the program interventions. As a result of this evidence, the program was subsequently closed.

Government data

Being able to use data for policy development and efficient service delivery is critical to the operation of the public service.

The Australian Bureau of Statistics is doing its part to enhance the value of its data – and of all government data – to expand capability in government to assess need, efficiency and effectiveness.

As Australian Statistician David Gruen has said, ‘while there is always more to do, we are making excellent progress improving the quality and timeliness of administrative data and, more broadly, integrated data assets’ (Gruen 2023).

Another priority at the Australian Bureau of Statistics, right behind continuing to produce high‑quality primary statistics about Australia, is to make secondary products – like the Business Longitudinal Analysis Data Environment (BLADE) and the Multi‑Agency Data Integration Project (MADIP) – more useful and informative for understanding ourselves.

The government is committed to ensuring the public service has the right capability and tools. Ultimately, this will lead to better policy advice, better regulation, and better services.

Closing remarks

Let me finish with a quote that I like:

‘The principal value of evaluation is that it improves the decision‑making process by providing a rational base from which judgements may be made’ (Senate Standing Committee on Social Welfare 1979).

These wise words aren’t new, and they don’t come from overseas. They come from the 1979 report of a Parliamentary Committee inquiry into evaluation in Australian health and welfare services.

The Parliamentary Committee said, ‘Without evaluation we cannot know whether a particular program is achieving anything at all or whether, for example, its effects are the reverse of its stated objectives’.

The first chapter of the Committee report is devoted to what evaluation is, and why evaluation is such a good thing.

The four essentials for any program – and its evaluation – are unchanged today:

  • a statement of needs
  • a statement of objectives or goals
  • a statement of the criteria by which success will be judged
  • and a database of evidence to measure these criteria.

The Committee went on to make a series of recommendations to improve data, develop indicators and spend more on health and welfare statistics. 

The final recommendation sounds a bit like the Australian Centre for Evaluation’s objectives.

The Committee called for a Secretariat to prepare a document outlining the methods available to organisations for the evaluation of their activities.

And it recommended that the Department of Health and Social Security provide a consultancy service, free of charge, to enable organisations receiving health and welfare grants from the Federal government to evaluate their own activities.

My point is that not a lot has changed in the theoretical approach in almost half a century. Better data and better evaluation have been a long‑standing concern, and a vital one.

However, what has changed is the data and digital environment – the capacity to generate, synthesise, analyse and share data analytics at scale, and in real time, has fundamentally shifted.

Funnily enough, the 1979 Senate Committee’s report was titled Through a Glass, Darkly in referring to Saint Paul’s point that we can’t see clearly now but if we get things right all will be clearer in heaven (I Corinthians 13:12).

So, data and evaluation really are a match made in policy heaven.

References

Australian Bureau of Statistics (ABS) 1940 ‘Year Book Australia, 1939’, Chapter IX, Public Hygiene, page 222.

Australian Bureau of Statistics (ABS) 2018 National Health Survey: First results [Reference Period 2017‑18 financial year], released 12 December 2018.

Australian Curriculum, Assessment and Reporting Authority (ACARA) 2017 School perspectives: Principals of schools achieving substantially above average gain in NAPLAN 2017.

Australian Institute of Health and Welfare (AIHW) 2021 Welfare expenditure, Australian Institute of Health and Welfare.

Bureau of Justice Assistance (BJA) 2012 Justice Reinvestment Initiative.

Coalition for Evidence‑Based Policy 2012 Rigorous Program Evaluations on a Budget: How Low‑Cost Randomized Controlled Trials Are Possible in Many Areas of Social Policy, Coalition for Evidence‑Based Policy.

Coalition for Evidence‑Based Policy 2014 New York City Teacher Incentive Program, Coalition for Evidence‑Based Policy.

Dixon S 2015 The employment and income effects of eight chronic and acute health conditions, The Treasury, New Zealand Government.

Goldstein R and Hiscox M 2018 School Enrolment and Attendance Measure Randomized Controlled Trial: Full Report, National Indigenous Australians Agency.

Gruen D 2023 Making the Most of Administrative and Integrated Data Assets [Speech delivered 3 August 2023 Australian Workshop on Public Finance]. 

Kalucza S and Baxter J 2021 Social investment: The New Zealand case, pp 24–37 in Chan J and Saunders P (eds), Big data for Australian social policy: Developments, benefits and risks, Academy of the Social Sciences in Australia, Canberra.

Leigh A 2020 ‘Estimating Long‑Run Incarceration Rates for Australia, Canada, England and Wales, New Zealand, and the United States’, Australian Economic History Review, 60(2): 148–185.

Life Course Centre 2023 Australian Government Life Course Data Initiative Life Course Centre.

McCann D 2019 Experiences of Education for Children in Care: Part 2: Review of New Zealand Government Data, Oranga Tamariki – Ministry for Children, Wellington, New Zealand.

O’Connell J n.d. Commonwealth Advisory Council on Nutrition, Australian Food Timeline, accessed August 2023.

Orlando C 2021 ‘Not just any job, good jobs!’: Youth voices from across the UK, Institute for Employment Studies.

Rothwell J and Crabtree S 2019 Not Just a Job: New Evidence on the Quality of Work in the United States, Lumina Foundation.

Senate Standing Committee on Social Welfare 1979 ‘Through a glass, darkly: Evaluation in Australian Health and Welfare Services’ Volume One, Australian Government Publishing Service, Canberra.

Song Z 2022 How big data can be a force for good, NZIER Insight 103, New Zealand Institute of Economic Research.