5. Embedding interventions into a strengthened monitoring and evaluation system

The implementation of rigorous Monitoring and Evaluation (M&E) systems is key to improving policy making (Lonean, 2020[1]). Thanks to careful M&E processes, decisions and implementation throughout the policy making cycle can be informed by policies implemented earlier or elsewhere that have been proven to deliver the desired results and objectives (or whose reasons for failure are understood). Besides supporting learning and the development of better policies, the M&E of public policies also contributes to accountability, as it provides detailed information on how policies are planned and implemented and can help communicate specific results to all relevant stakeholders.

Strengthening accountability through M&E is particularly important for the youth policy arena, given the cross-sectoral nature of youth policy and its numerous interactions with other sectoral policies (e.g. education, employment, social inclusion and health). However, the interconnectedness of these sectors and the intersectionality of risk factors and needs make youth policies more difficult to monitor. In addition, the monitoring and evaluation of interventions that are meant to have a preventive impact in the medium and long term requires longitudinal data collection across a variety of areas, including labour market and education outcomes. While this is true for all preventive interventions and not just in the youth policy area, the latter faces additional constraints related to: (i) obtaining consent for data collection from minors and their parents; (ii) the fact that young people tend to be more geographically mobile than the general population and so may be more difficult to track and to convince to participate; and (iii) the practical and potential ethical difficulties in varying access to interventions that are thought to be beneficial.

This section reviews four axes along which to strengthen M&E systems and their impact on youth policy making: (1) strengthening the data infrastructure, including to capture intersectionality; (2) putting in place robust ethical oversight and data privacy safeguards; (3) promoting the use of high-quality data for M&E; and (4) disseminating M&E results.

The quality and availability of data (survey data, administrative data, programme implementation data, etc.) are key factors in how easily a policy can be monitored and evaluated and how rigorous the resulting evaluation can be (OECD, 2019[2]). Indeed, evidence-informed policy making can be hindered by a lack of adequate information and by capacity gaps among government departments and agencies in generating such information in a format suitable for evaluation purposes. It is key for policy analysts and evaluators to understand what evidence currently exists across institutions, what information such resources cover, and how it can be accessed.

Australia benefits from a strong commitment to evidence-based policy making and from investments in data infrastructure. The country scores highly on the OECD OURdata Index, which rates countries according to data availability, data accessibility and government support for data re-use. Australia ranked sixth in the 2019 index, with above-average scores on all three indicators. It is one of the leaders in promoting government data re-use both within and outside the public sector, and it made one of the most noticeable improvements in data availability among OECD countries since 2017 (OECD, 2020[3]).

The availability of timely and comprehensive longitudinal data can support the monitoring and evaluation of “what works” in preventing young people from ending up not in education, employment or training (NEET). The relevance of longitudinal data for this sort of research is shown by a review commissioned by England’s Department for Education to collate and synthesise available evidence on how interventions targeting young people at key transition points (mainly those aged between 14 and 16) can lead to future improvements in education and employment attainment (Learning and Work Institute, 2020[4]). Of the 58 studies analysed in the review, some of which are cited in the previous chapters of this report, almost two-thirds made use of longitudinal data to evaluate the effects of the intervention on the outcomes of interest.

Australia is further improving the generation of high-quality longitudinal datasets through the Multi-Agency Data Integration Project (MADIP), which integrates data from multiple sources, making it possible to identify outcomes for young people across a variety of dimensions (allowing intersectionality to be captured) and to follow them over time (Box 5.1).

To date, a considerable number of projects make use of MADIP data for evaluation purposes. A key example is the VET National Data Asset project “Measuring the outcomes of VET students”, which integrates data from MADIP and the Business Longitudinal Analysis Data Environment (BLADE) to enhance the evidence base on the employment and social outcomes of VET students in Australia. Similarly, the Post-School Destinations Project combines MADIP data with assessments from the National Assessment Program – Literacy and Numeracy (NAPLAN) to investigate post-school destinations and outcomes for students at the national and state/territory level, especially those from disadvantaged backgrounds.

In addition to administrative data, cohort studies covering children and young people, including the multiple cohorts of the Longitudinal Surveys of Australian Youth (since 2003 recruited from schools taking part in PISA), are also available and provide complementary information that cannot be found in administrative records.

Despite the advances made in this arena, however, cohort-specific research is still lacking on the influence of interventions focused on the “middle years” (i.e. ages 8 to 15) on NEET prevention.

A leading example of integrated national administrative data is New Zealand’s Integrated Data Infrastructure (IDI), established and maintained by Stats NZ. Data in the IDI form an “ever-resident” Aotearoa New Zealand population of around 9 million people and their households (Jones et al., 2022[6]). The aim of the IDI, which was established in 2011, is to provide a research tool to understand complex social and economic issues in more depth, to inform policy, to help with the targeting of resources, and to undertake impact evaluations. The IDI links data from different government agencies, statistical surveys, and non-government organisations, and enables researchers to compare population outcomes across a wide range of variables, including education, income, benefits, migration, justice and health, and allows for the adoption of an intersectionality lens in the design of policies.

As set out in Jones et al. (2022[6]), New Zealand has unique characteristics that may not be easily transferable to other contexts, but its data integration process can provide useful lessons for other countries wanting to implement a similar approach.1 The split accountability between federal and state governments in Australia may make such integration more difficult and may require additional policy or legislative structures to give data suppliers confidence that information will not be used to the detriment of the people they serve. Key elements in the New Zealand data integration process have been solid infrastructure design, political support, a strong regulatory environment, good data quality, close collaboration with analysts and, last but not least, trust.

Generating and maintaining a high-quality data infrastructure relies on strong ethical oversight that guarantees privacy and delivers value. Individuals’ trust in the collection and use of their data is intrinsically dependent on the level of data security and the extent to which data are used for their benefit and the development of effective policy making.

The success of New Zealand’s IDI is entirely dependent on the safe and ethical use of the data it contains. Two frameworks guide decision making about access to the IDI: the Five Safes and Ngā Tikanga Paihere (a collection of Māori customary practices). Both frameworks are intended to ensure data are treated in responsible and culturally appropriate ways (Jones et al., 2022[6]). All applicants and their proposed research must meet the Five Safes conditions (safe people, projects, settings, data and output) and demonstrate they have appropriate cultural safeguards in place to conduct research in a way that will benefit Māori and other priority populations (e.g. people with disabilities).

More generally, any use of data or information about people, families and communities (whether it can identify people or not) must be done in a safe, transparent and trustworthy way. Not only will this approach help to increase public trust and confidence in governments’ legitimate role in the collection, processing and use of data, it will also contribute to the design and delivery of more effective, user-centred policies and services. Data use, including the decisions and actions that derive from it, should prevent, avoid, or at the very least limit intentional harm. It should not lead to or perpetuate discrimination. It should instead promote inclusion, respect diversity, and ensure that individuals and communities are treated equally and benefit equally from the outcomes a data-driven public sector aims to deliver (OECD, 2020[7]).

For instance, the New Zealand Government undertook an extensive public engagement process in 2019 to create the Data Protection and Use Policy (DPUP) (Social Wellbeing Agency, 2021[8]). The engagement process revealed a complex landscape of privacy legislation, regulation and rules that people struggled to navigate. The DPUP provides clear and practical guidance (principles, guidelines and a toolkit) on how personal information can and cannot be used in the social sector to provide confidence to those collecting and using the data, and to those to whom the data belongs.

There are many examples of good practice guidance for governing and managing data ethics and privacy, including the OECD’s The Path to Becoming a Data-Driven Public Sector (OECD, 2019[2]). Australia also has its own best practice examples to draw on, such as the public consultation on the proposed data sharing and release legislation in 2019 (Australian Government Department of the Prime Minister and Cabinet, 2019[9]) and the way the results of that engagement shaped the final legislation, the Data Availability and Transparency Act 2022. This Act establishes a new, best practice scheme for sharing Australian Government data, underpinned by strong safeguards and consistent, efficient practices.

The approach chosen to identify and address any risks associated with the use of personal information for M&E purposes – whether guidelines, a framework and/or processes – must comply with relevant legislation, policies and guidance, and must be able to accommodate a range of data uses, including new and emerging ones such as using data to identify and target young people in greatest need of intervention (i.e. predictive risk modelling). A framework (and/or guidelines) that steps a decision maker through the relevant technical, ethical, privacy, public interest and other considerations at different stages of a project would support an assessment of whether any risks outweigh the benefits or can be sufficiently and safely mitigated. One possible outcome of such an assessment might be to refer the project to a relevant ethics committee.

A strong data system can make the creation of a sustainable M&E system much easier, but it is not on its own a sufficient condition. Among other factors, government departments, agencies and programme providers need either to have the in-house capability to use existing data for M&E tasks or to outsource this work – in which case they still need to understand the results and ensure that they feed into evidence-based policy making.

In principle, Australia benefits from a long-embedded M&E culture. The country’s evaluation strategy, driven by the Department of Finance, was in place from 1987 to 1997 and was complemented by greater attention to monitoring from 1995 onwards; it remains a well-known example of evidence-based decision making (Mackay, 2011[10]). More recently, the Public Governance, Performance and Accountability Act 2013 emphasised the importance of performance reporting.

Despite this legal framework, several institutions have criticised current M&E efforts as neither frequent enough nor of sufficient quality (Bray and Gray, 2019[11]). For policies that concern young people in particular, the scope of M&E activities in Australia appears mixed. On the one hand, evaluations exist for a number of youth-related programmes, including three evaluations of the 2009 National Partnership on Youth Attainment and Transitions and an evaluation of the National Support for Child and Youth Mental Health Programme. On the other hand, the Australian Government’s new Youth Engagement Model, which will establish an Office for Youth and ensure that young people from diverse and at-risk cohorts are represented in consultations and engagement with government, makes no explicit reference to the development of an M&E strategy.

Different strategies are being used across OECD countries to encourage the uptake of M&E practices to inform policy making. In the United States, for example, the federal government sought to increase the use of evidence in policy making across all federal agencies, acknowledging that some agencies were already adept at using evidence while others lacked the necessary skills or capacity. As a result, the Foundations for Evidence-Based Policymaking Act was approved in 2019. The law pushes agencies to adopt stronger evaluation practices in order to generate more evidence about what works and what needs improvement, and establishes that any data collected should be made accessible across agencies and to external groups for research purposes.

Other countries have opted to create a dedicated team or agency to evaluate policies across the board. A key example is the UK Cabinet Office’s “What Works Network”, which aims to improve the generation, sharing and use of high-quality evidence within government (Box 5.2). In the same vein, the US Office of Management and Budget’s Evidence Team acts as a central hub of expertise on setting research priorities and selecting appropriate methodologies for federal evaluations, and in Korea, the government Performance Evaluation Committee is responsible for evaluating the policies of central government agencies on an annual basis (OECD, 2022[12]). Another relevant example is Mexico’s National Council for the Evaluation of Social Development Policy, a decentralised public body responsible for generating objective information and undertaking evaluations across a wide range of social policies.

A robust M&E function cannot be complete without the results of M&E activities being made available to their intended users. M&E results need to be communicated and disseminated to key stakeholders, as making evaluation results public is an important element in ensuring their impact and increasing the use of findings for evidence-based policy making (OECD, 2020[15]).

An increasing number of OECD countries make evaluation results public, encouraging openness and transparency in the public sector. For example, all evaluations commissioned in Poland, including those concerning the implementation of EU funds, must now be made accessible to the public. For this purpose, a national database of evaluations has been created, and findings are published on a dedicated website. This repository currently shares the results of more than a thousand studies conducted since 2004, as well as a series of methodological tools and other resources aimed at evaluators.

A similar approach has been taken by Norway, which has created a dedicated public website gathering all the findings of evaluations carried out by the central government. The portal, operated by the Directorate for Financial Management and the National Library of Norway, contains evaluations carried out on behalf of government agencies since 2005, as well as a selection of central evaluations conducted between 1994 and 2004. The website also provides knowledge-sharing resources, such as evaluation guidelines, a calendar of key activities in the evaluation area, and external links to professional papers and other resources of interest.

In the same manner, the Institute of Education Sciences (IES), the research arm of the United States Department of Education, has set up a web portal called the What Works Clearinghouse (WWC), whose main objective is to help policy makers, researchers and other education practitioners learn about policies and interventions with a proven impact on improving students’ outcomes. The WWC collects evidence through a rigorous systematic review methodology, and results are presented through an interactive portal where users can sort by type of intervention, desired outcomes and effectiveness of the intervention.

Concerning child rather than youth outcomes, the European Platform for Investing in Children (EPIC) is an evidence-based online platform that consolidates information on policies for children and their families in Europe. Its main objective is to serve as a tool to monitor activities implemented across member states in response to the Recommendation on Investing in Children, which encourages member states to implement multidimensional policies to tackle child poverty and social exclusion in Europe. EPIC also serves as a repository for sharing good practices in policy making for children and families and fosters co-operation and mutual learning in the field (Box 5.3).


[5] Australian Bureau of Statistics (2022), Multi-Agency Data Integration Project (MADIP), https://www.abs.gov.au/about/data-services/data-integration/integrated-data/multi-agency-data-integration-project-madip (accessed on 16 November 2022).

[9] Australian Government Department of the Prime Minister and Cabinet (2019), Data Sharing and Release: Legislative Reform, https://www.datacommissioner.gov.au/sites/default/files/2022-08/Data%20Sharing%20and%20Release%20Legislative%20Reforms%20Discussion%20Paper%20-%20Accessibility.pdf.

[11] Bray, B. and M. Gray (2019), Evaluation and learning from failure and success: An ANZSOG research paper for the Australian Public Service Review Panel, https://www.apsreview.gov.au/sites/default/files/resources/evaluation-learning-failure-success.pdf.

[16] European Commission (2022), European Platform for Investing in Children (EPIC), https://ec.europa.eu/social/main.jsp?catId=1246&langId=en (accessed on 16 November 2022).

[6] Jones, C. et al. (2022), “Building on Aotearoa New Zealand’s Integrated Data Infrastructure”, Harvard Data Science Review, https://doi.org/10.1162/99608f92.d203ae45.

[4] Learning and Work Institute (2020), Evidence review: What works to support 15 to 24-year olds at risk of becoming NEET?, Learning and Work Institute, https://learningandwork.org.uk/wp-content/uploads/2020/04/Evidence-Review-What-works-to-support-15-to-24-year-olds-at-risk-of-becoming-NEET.pdf (accessed on 16 November 2022).

[1] Lonean, I. (2020), Insights into youth policy evaluation, Council of Europe and European Commission, https://pjp-eu.coe.int/documents/42128013/47261953/122018-Insights_web.pdf/99400a12-31e8-76e2-f062-95abec820808 (accessed on 16 November 2022).

[10] Mackay, K. (2011), “The Australian Government’s M&E System”, PREM Notes and Special Series on the Nuts and Bolts of Government M&E Systems, No. 8, World Bank, Washington, D.C., https://documents1.worldbank.org/curated/en/577181468220168095/pdf/643850BRI0Aust00Box0361535B0PUBLIC0.pdf.

[12] OECD (2022), Evolving Family Models in Spain: A New National Framework for Improved Support and Protection for Families, OECD Publishing, Paris, https://doi.org/10.1787/c27e63ab-en.

[15] OECD (2020), “Building Capacity for Evidence-Informed Policy-Making: Lessons from Country Experiences”, OECD Public Governance Reviews, OECD Publishing, https://doi.org/10.1787/86331250-en.

[7] OECD (2020), Good Practice Principles for Data Ethics in the Public Sector, OECD, Paris, https://www.oecd.org/gov/digital-government/good-practice-principles-for-data-ethics-in-the-public-sector.pdf (accessed on 3 April 2023).

[3] OECD (2020), OECD Open, Useful and Re-usable data (OURdata) Index: 2019, OECD, Paris, https://www.oecd.org/governance/digital-government/ourdata-index-policy-paper-2020.pdf (accessed on 16 November 2022).

[2] OECD (2019), The Path to Becoming a Data-Driven Public Sector, OECD Digital Government Studies, OECD Publishing, Paris, https://doi.org/10.1787/059814a7-en.

[8] Social Wellbeing Agency (2021), Data Protection and Use Policy (DPUP), https://www.digital.govt.nz/standards-and-guidance/privacy-security-and-risk/privacy/data-protection-and-use-policy-dpup/.

[13] UK Government (2022), What Works Network, https://www.gov.uk/guidance/what-works-network (accessed on 16 November 2022).

[14] What Works Network (2018), The What Works Network - Five Years On, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/677478/6.4154_What_works_report_Final.pdf (accessed on 16 November 2022).


1. The article by Jones et al. (2022[6]) provides a brief history of the development of the IDI and shares some lessons learned along the way for the benefit of others starting down the integrated data journey.

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2023

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at https://www.oecd.org/termsandconditions.