23 July 2024

Address to Asian Development Bank, Manila, Philippines

Making development count: evidence for real change

Good morning, it’s terrific to be here at the Asian Development Bank’s headquarters.

Thank you, Rachel Thompson (Asian Development Bank, Executive Director), for the introduction and for facilitating today’s roundtable.

Australia is a founding member of the Asian Development Bank, and like all members, we want to see positive change for the region’s poorest and most vulnerable.

Earlier this year, the Australian Government pledged its support (A$492 million) for the Asian Development Fund 2025–28 round (ADF14) (Wong 2024).

But our membership isn’t limited to the funding side of things.

Member countries also commit to sharing their expertise and practical experience across a range of areas.

And this visit is a great opportunity to share Australia’s experiences on competition policy and evaluation – 2 areas relevant to the Bank’s wider objectives of improving living standards and making an impact cost‑effectively.

At the Asian Development Bank’s Competition Dialogues later this afternoon, I will share lessons learnt on competition reform from Australia.

And at the Asian Development Bank Institute tomorrow, I will argue that the more we can figure out what works, the better we can make development programs work for everyone – especially for the most disadvantaged.

On a similar note, I welcome the opportunity to make some opening remarks on making development count and using evaluation and data to maximise our efforts.

Data for evaluation

In recent years, ‘we have seen an explosion in the availability of data’, providing greater opportunities to gather evidence and improve programs through evaluation (Gruen 2024).

Many of us in the research community heard big talk about big data for years, but COVID‑19 helped shift the data dial.

Integrated data assets

COVID‑19 raised a series of urgent public policy questions in Australia that could be tackled only using integrated data assets (Gruen 2024).

And we continue to benefit from that data legacy.

One example of a large‑scale integrated dataset is the Person Level Integrated Data Asset (PLIDA) developed by the Australian Bureau of Statistics (Leigh 2024a).

This dataset brings together data on health, education, government payments, income and taxation, employment, and population demographics (including the Census) over time.

Having this dataset at our fingertips means we can better understand how people move between services and track both the intended and unintended outcomes of policy changes.
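To make the mechanics concrete, here is a minimal sketch of that kind of person‑level linkage, using hypothetical table and column names (PLIDA’s actual structure, scale and secure access arrangements differ):

```python
import pandas as pd

# Hypothetical de-identified extracts; real integrated assets like PLIDA
# are accessed in secure environments, not as local files.
health = pd.DataFrame({
    "person_key": [1, 2, 3],
    "hospital_visits_2023": [0, 2, 1],
})
payments = pd.DataFrame({
    "person_key": [1, 2, 4],
    "income_support_weeks_2023": [0, 26, 52],
})

# Join domains on a common de-identified person key, keeping everyone
# who appears in either source, so service pathways can be traced.
linked = health.merge(payments, on="person_key", how="outer")
print(linked)
```

Once the different domains share a common de‑identified key, researchers can follow the same individuals across services and over time.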

As of November last year, there were more than 200 active research projects tapping into this integrated data asset (Leigh 2024a).

These included projects that aim to support the agriculture sector with better forecasts of labour demand for on‑farm workers.

Projects that explore how health and socio‑economic factors drive poor early development outcomes for children.

And projects that aim to build a stronger evidence base for decisions about disaster mitigation and recovery investments.

These examples demonstrate the power of large‑scale integrated data assets, even when paired with traditional research methods.

Data benefits evaluations

Administrative data can also help us gather evidence and conduct more efficient evaluations.

First, administrative data allows researchers to avoid expensive surveys and the low response rates that come with them.

Second, it means researchers can include everyone in evaluations, rather than just the subset who agreed to be surveyed.

And third, it allows policymakers to test different approaches before rolling out system‑wide changes.

For example, Australia’s Department of Employment and Workplace Relations is using the Person Level Integrated Data Asset to test different settings for online employment services.

In partnership with the Australian Centre for Evaluation, it is conducting a series of randomised trials and using integrated data to assess differences in participant outcomes.

In other words, what works to support people to find jobs.

The results aren’t in yet, but the trials have the potential to result in more flexible and targeted support for job seekers and better safeguards for online employment services.
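To illustrate the underlying logic – with simulated data and invented variable names, not the Department’s actual trial design – a randomised trial lets us estimate a program’s effect from a simple comparison of group means:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated trial: 10,000 job seekers randomly assigned to a new
# online-service setting (treated=1) or the existing one (treated=0).
n = 10_000
treated = rng.integers(0, 2, size=n)

# Hypothetical linked administrative outcome: employed at 6 months.
# We assume a true effect of +3 percentage points on a 50% base rate.
employed = rng.random(n) < (0.50 + 0.03 * treated)

# Under random assignment, the two groups differ only by chance, so the
# difference in means is an unbiased estimate of the average effect.
effect = employed[treated == 1].mean() - employed[treated == 0].mean()
se = np.sqrt(
    employed[treated == 1].var(ddof=1) / (treated == 1).sum()
    + employed[treated == 0].var(ddof=1) / (treated == 0).sum()
)
print(f"estimated effect: {effect:.3f} (95% CI ±{1.96 * se:.3f})")
```

Linked administrative data supplies the outcome measure here, which is what makes trials like these cheap to run at scale.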

Data in the long run

Administrative datasets can also provide evidence on whether development programs are having an impact over the long run – the big question that keeps us all awake at night.

In one example, researchers crunched data from 5 different administrative datasets to evaluate the PACES voucher program in Colombia (Bettinger et al. 2018).

The program began in 1994 and used a lottery to randomly assign vouchers for private secondary schools to applicants from public primary schools in Colombia’s poorest districts (Bouguen et al. 2018).

More than 20 years later, researchers were able to use administrative datasets to compare the educational, financial and job outcomes of applicants who received the voucher and those who didn’t.

The study found recipients of private secondary school vouchers completed more tertiary education, earned 8 per cent more, and had better credit scores (Bouguen et al. 2018).

Development and evaluation

The Australian Government has committed to strengthening our approach to evaluations as part of our development program (DFAT 2023).

The Department of Foreign Affairs and Trade administers our development program, which focuses our overseas efforts on gender equality, disability and social inclusion (DFAT 2024).

As part of its commitment to continuous improvement, the Department publishes around 40 independently authored evaluations each year.

These evaluations help us learn what works and, just as importantly, what doesn’t work (DFAT n.d.).

Evaluations also help inform our investment decisions and provide accountability by demonstrating effectiveness.

Partnerships are a central part of our strategy.

The Department works to commission or support evaluations in partnership with the Abdul Latif Jameel Poverty Action Lab (J‑PAL), the World Bank and the Asian Development Bank.

Philippines – Pantawid Pamilyang Pilipino Program

For example, the Australian Government has supported evaluations of the Pantawid Pamilyang Pilipino Program (4Ps) in the Philippines.

The 4Ps program provides cash transfers to poor and vulnerable households conditional on investments in child education and health as well as the use of maternal health services.

The program was evaluated with a randomised trial of 2500 households with at least one young child (Orbeta et al. 2021).

Children who received the transfer were 7 per cent less likely to be born at a low birthweight, and in their first 1000 days they were 3 per cent less likely to experience diarrhoea and 4.5 per cent less likely to have a fever.

However, on other health metrics like stunting, wasting and vaccination rates, the evaluation showed the transfer did not have a measurable effect.

A key recommendation of the 4Ps evaluation in 2021 was to ‘strengthen program aspects that influence the first 1000 days to promote better health among young children and pregnant women’.

While not the only factor, the evaluation helped provide the evidence for change: expanded cash grants for pregnant and lactating women have since been approved as a further incentive to access health services (DSWD 2024).

The evaluation also recommended addressing gaps in updating program data to ensure newborns and new pregnancies are recorded. Again, not the only driver of change, but more frequent checks on data updating are now taking place.

Indonesia – Kartu Prakerja Program

In another example, the Department supported J‑PAL to conduct a randomised trial on the impact of Indonesia’s Kartu Prakerja program.

The program, launched in 2020, aims to upskill unemployed workers by providing them with subsidies to join training classes or start a business.

The J‑PAL study evaluated the program’s impact on labour, consumption and financial behaviour.

It also compared the effectiveness of different training courses, program targeting processes, and take‑up.

Indonesia – Program Keluarga Harapan

In a separate study, the Department supported J‑PAL Southeast Asia to complete a randomised trial of Indonesia’s conditional cash transfer program, Program Keluarga Harapan.

The evaluation tested the effect of the program with a sample of 13,600 poor households with children or a pregnant woman (J‑PAL 2021).

The evaluation found the program had impressive effects, more than halving both severe stunting and the number of children not in school.

Based on this evidence and the evidence of other similar studies, the Indonesian government expanded the program from 5.8 million households to 10 million households.

They also almost tripled the maximum transfer amount that a household could receive.

The evaluation helped policymakers to see that this was an intervention that was having an impact and deserved to be expanded.

Building capabilities in Australia

Building on our international development program, we want evaluations to be part of the broader policy toolkit in Australia.

The Committee for Economic Development of Australia examined a sample of 20 Australian Government programs conducted between 2015 and 2022 (Winzar et al. 2023).

The programs had a total expenditure of over A$200 billion, yet the think tank found that 95 per cent were not properly evaluated.

The Committee for Economic Development of Australia’s findings are echoed in other major reports – Australian Government evaluation has in the past fallen short.

That’s where the Australian Centre for Evaluation – which I mentioned earlier – comes in (Leigh 2024b).

Established a year ago, the Centre aims to expand the quality and quantity of evaluation across the public service.

One way the Centre operates is by striking partnerships with departments that facilitate ongoing collaboration.

The Centre has also taken an active role in considering aspects that are relevant to all evaluations, such as rigorous ethical review and access to administrative data.

And it’s spreading the word that a high‑quality impact evaluation needs to be built into the design of a program from the outset.

To bring about long‑term change in mindset, there’s a lot to be gained from collaboration with evaluation researchers outside of government.

The Centre is working to change the impression that ‘academics are from Venus, public managers are from Mars’.

So we were delighted to announce the establishment of a new Impact Evaluation Practitioners Network to bring together the best of both worlds.

Closing remarks

Thank you again for the opportunity to open today’s panel discussion.

As I said earlier, I look forward to addressing the Asian Development Bank events over the next couple of days and sharing Australia’s experience on competition policy.

I also look forward to learning more about the Bank’s objectives around evaluating development programs – and this panel discussion is a great way to start.

Thank you.

References

Bettinger E, Kremer M, Kugler M, Medina C, Posso C and Saavedra J (2018) School vouchers, labor markets and vocational education.

Bouguen A, Huang Y, Kremer M and Miguel E (2018) Using RCTs to Estimate Long‑Run Impacts in Development Economics [PDF], Harvard.

Department of Foreign Affairs and Trade (DFAT) (2023) Australia’s International Development Policy [PDF], published August 2023.

Department of Foreign Affairs and Trade (DFAT) (2024) Australia’s Official Development Assistance Budget Summary 2024–25.

Department of Foreign Affairs and Trade (DFAT) (n.d.) Development evaluation, web content, accessed July 2024.

Department of Social Welfare and Development (DSWD) (2024) DSWD welcomes PBBM’s approval on expanded 4Ps cash grant for pregnant, lactating women, Media Release issued 12 June 2024.

Gruen D (2024) The Rise of Big Data and Integrated Data Assets, Address to the EY Conference, 23 February 2024.

J‑PAL (2021) The Medium‑Term Impact of Conditional Cash Transfers on Health and Education in Indonesia, web content, accessed July 2024.

Leigh A (2024a) From chance to change: leveraging randomised trials and data science for policy success, Address to the Human Technology Institute Shaping Our Future Symposium, University of Technology Sydney, 1 February 2024.

Leigh A (2024b) Discovering what works: why rigorous evaluation matters, Address to the Australian Evaluation Showcase, Canberra, 17 June 2024.

Orbeta A C, Melad K A M and Araos N V V (2021) Longer‑term effects of the Pantawid Pamilyang Pilipino program: Evidence from a randomized control trial cohort analysis (third wave impact evaluation) [PDF], PIDS Discussion Paper Series No. 2021–01.

Winzar C, Tofts‑Len S and Corpuz E (2023) Disrupting Disadvantage 3: Finding What Works, CEDA, Melbourne.

Wong P (2024) Australia’s Asian Development Fund pledge delivers for the region, Media Release issued 3 May 2024.