3. Knowledge governance and promoting the systematic use of evidence

The complexity of education systems requires particular attention to knowledge processes. Data and other information relevant for decision making are collected and needed at potentially different times and places. Information is often produced in one form and needed in another to inform decision making. Actors have diverse roles and responsibilities and may produce some information while requiring other information in their decision-making processes (Burns, Köster and Fuster, 2016[1]).

On the supply side, knowledge governance means bringing together varied, adequate and relevant information and knowledge. This includes producing knowledge directly, for example through policy experimentation, piloting and evaluation, as well as collecting and consolidating administrative and performance data. It includes facilitating knowledge production, for example by shaping funding channels or otherwise incentivising research activities and evidence production (Langer, Tripney and Gough, 2016[2]). For information to be useful for decision making in policy and practice, decision makers need to transform information into actionable knowledge. Knowledge pertains to “assimilated information and the understanding of how to use it” (Hess and Ostrom, 2007, p. 8[3]).

Decisions in professional and policy-making contexts draw on a wide range of knowledge and complex considerations, so that evidence cannot be the only factor driving decisions. Policy decisions are embedded in value-driven political contexts and may have no, or multiple, technically ‘best’ solutions (Newman and Head, 2017[4]). Teachers’ and other professionals’ decisions in classrooms and workplaces are guided by a vast body of practical and tacit knowledge. This allows decision makers to adapt to local contexts and puts them in the best position to identify the sources of evidence, data and other information necessary for the specific decision-making challenge at hand. However, decision makers may also lack the opportunity or motivation to integrate new information into their knowledge or to move beyond familiar approaches. They may lack the capability to do so effectively and efficiently. They may follow beliefs about what works or rely on cognitive shortcuts (heuristics) rather than drawing on systematic investigation to gather adequate evidence and decide on a course of action in a given context (Fahey and Köster, 2019[5]; Burns, Köster and Fuster, 2016[1]).

In the context of this complexity, the demand side of knowledge governance means promoting decision makers’ capability, motivation and opportunity to consider evidence systematically when making decisions. Using evidence systematically means considering evidence consistently, not only where it aligns with preconceived notions or only in specific situations.

Evidence pertains to the product of any “systematic investigative process employed to increase or revise current knowledge” (Langer, Tripney and Gough, 2016, p. 11[2]). This includes formal research, for example as carried out by research institutions, government agencies or think tanks; systematically gathered understandings from education practice and the practice of policy making, implementation, and evaluation; as well as factual administrative and achievement data (Langer, Tripney and Gough, 2016[2]).

Motivation, capability and opportunity are all needed to make use of evidence to produce actionable knowledge. Decision makers will not use evidence if they are not motivated to do so; they cannot do so if they do not know how; and they will not consider evidence in their decisions when they do not have the opportunity to do so. Conversely, the three components can promote one another. For instance, the opportunity or the capability to use evidence can strengthen the motivation to do so. Moreover, using evidence successfully for a decision can expand the knowledge needed to engage meaningfully with evidence and can motivate a more systematic use of evidence. Capability pertains to the knowledge and skills necessary to engage in the use of evidence. Motivation includes habits as well as active decisions to use evidence. Opportunity refers to all external factors that make evidence use possible or prompt it, such as access to a data warehouse to explore evidence and the time to do so (Michie, van Stralen and West, 2011[6]). Based on empirically observed mechanisms (Langer, Tripney and Gough, 2016[2]), the OECD Strategic Education Governance project identifies five areas to promote the capability, motivation and opportunity to use evidence in decision making (Michie, van Stralen and West, 2011[6]). These areas pertain to: the skills to access and make sense of evidence; making evidence conveniently available; organisational processes encouraging the use of evidence; collaboration with evidence producers and collegial exchange; and building a common understanding of the importance of evidence, which evidence is useful and how it is best used (Figure 3.1).

The more proficient decision makers are in using evidence, the more likely they are to use it systematically, to greater effect and with better comprehension in practice. Building skills for using evidence pertains to fostering the individual capability to access and make sense of evidence. This includes the skills for locating, appraising and synthesising evidence to integrate it with other information and particular needs. Appraising evidence pertains to examining research systematically and critically, with the aim of judging its trustworthiness and its value and relevance in particular contexts (Langer, Tripney and Gough, 2016[2]; Hyde et al., 2000[7]).

Avenues to foster such skills include initial education, dedicated professional training and other continuing education formats, including external offers such as university courses (individual and degree), university and professional accreditations, and short courses offered by other providers. Requisite skills may also be fostered through mentoring and coaching, structured exchange among colleagues, and learning platforms and e-learning offers (Abdullah et al., 2014[8]; Chambers et al., 2011[9]). Supervisors with the skills to supervise their staff’s use of evidence are more likely to motivate staff to make better use of evidence and to acquire relevant skills. Building these supervisory skills includes dedicated training offers for supervisors, continuing education formats (including external offers), mentoring and coaching, exchange with colleagues, and learning platforms and e-learning. To supervise their staff’s use of evidence effectively, supervisors need suitable resources, comprising relevant physical and organisational resources and requisite time. Specific resources include education and training requirements, instructions and assistance for conducting employee reviews or reflection and planning discussions, and staff development measures (Langer, Tripney and Gough, 2016[2]; Gray et al., 2012[10]).

Making evidence available pertains to communicating evidence and providing decision makers with convenient access to evidence. Making evidence available effectively increases the usefulness of evidence and makes evidence more likely to be used in decision making. Evidence can be communicated directly, for example through newsletters, publications, handouts, research teasers or research summaries (Cordingley, 2016[11]). Making evidence available also includes providing access to evidence, for instance through databases or evidence repositories. Databases may contain various information such as administrative or performance data and may connect these data within or across organisations. Evidence repositories are collections of consolidated evidence compiled centrally, often online, providing an organised body of related information (Langer, Tripney and Gough, 2016[2]).

Indiscriminate communication of evidence can be inefficient because it is difficult for decision makers to identify the elements most relevant for their practice. Targeting evidence to those to whom it is most relevant avoids burdening decision makers. For instance, relevant target groups at the school level may include school leaders, administrators, staff in the area of quality development, and teachers. Evidence may be targeted to students and parents as well. Within target groups, tailoring evidence to decision makers’ preferences and work habits increases convenience and personal salience. Relevant dimensions for tailoring evidence include differences in experience and capabilities in dealing with evidence, interests in content and topics, language style and proficiency, and preferred information channels such as newsletters, handouts, databases and evidence repositories. Approaches that can be used to target and tailor evidence include consulting experts when designing knowledge resources and communication strategies; holding feedback discussions with users; surveying decision makers regarding preferences and needs; consulting prospective users when designing knowledge resources; and collaborating with decision makers in making evidence available (Noar, Benac and Harris, 2007[12]; Kreuter and Wray, 2003[13]; Langer, Tripney and Gough, 2016[2]).

Collaborating with decision makers in making evidence available allows shaping communication techniques, modes of access and the presentation of evidence in the manner most relevant for their needs. It can further help build ownership of evidence and foster appetite for closer engagement with evidence production. Such efforts should be mindful of the time, effort and commitment required to avoid overburdening decision makers. Collaboration with decision makers includes collaboration in developing content, soliciting content (such as contributed texts), and involvement in evaluating communication techniques, tools and knowledge resources (O’Mara-Eves et al., 2013[14]; Langer, Tripney and Gough, 2016[2]).

Organisational processes encourage and support evidence use. This includes integrating evidence use in existing processes and structures, making decision-making processes transparent and comprehensible, inviting diverse perspectives into decision making, and establishing knowledge management systems.

Use of evidence should become a routine and fluent practice. Efforts to strengthen the use of evidence should be integrated into existing decision-making processes to maximise the opportunity to use evidence directly when the need arises and motivate decision makers to use available evidence. Conversely, introducing additional structures can be ineffective where they do not fit existing processes and habits, such as introducing a knowledge broker in an environment where evidence use is already high. Shaping organisational processes and structures should include considering opportunity costs in terms of required (prior) investments in time, resources, and skills of decision makers. To promote take-up and sustainability of efforts to promote evidence use, opportunity costs should be proportional to the benefits associated with the efforts. Incremental changes in decision-making processes and structures may be more cost-effective and sustainable than more ambitious alternatives (Bunn and Sworn, 2011[15]; Langer, Tripney and Gough, 2016[2]).

Transparent and comprehensible decision-making processes make clear how decisions were reached, which assumptions were made and which evidence was used in doing so. Highlighting how and which evidence has been considered in a given decision motivates the use of evidence. Specific tools include (publicly) accessible documents and exchange formats that discuss decisions and how they were reached. Exchanges may be among internal stakeholders, for example within the school leadership team. Exchanges may also involve external stakeholders, such as the school inspectorate engaging in exchange with schools or the school exchanging with school partners (Harvey et al., 2002[16]; Nutley, Walter and Davies, 2007[17]).

Inviting a range of perspectives, experiences and knowledge into decision-making processes can strengthen the systematic use of evidence by motivating the consideration of different sources of evidence. Organisational instruments can support both efforts to enhance the transparency of decisions and efforts to invite diversity into decision making. In particular, this support can come in the form of tangible means such as procedures, protocols or other tools employed within an organisation (Durand et al., 2014[18]).

Knowledge management serves to identify, store and link existing knowledge and to create new knowledge so that decision makers can use knowledge in a goal-oriented way. Knowledge management supports evidence use and decision making by providing links between evidence and other existing knowledge. Organisational knowledge management should align with decision-making processes so that it can be put to best use. Knowledge management can be online, such as online platforms and databases, as well as offline, for instance in the form of a local collection of printed handbooks, teaching materials or organisational documents. It can be organisation-wide or provide access to knowledge across multiple organisations, such as schools, school clusters or education regions (Quinn et al., 2014[19]).

The exchange between decision makers and providers of evidence, and exchange among decision makers themselves, can provide important impetus to promote the systematic use of evidence (Kothari, Birch and Charles, 2005[20]). Interaction between evidence providers and decision makers can facilitate reaching a common understanding of evidence and help evidence providers gather information on the expectations decision makers place on evidence. It can motivate decision makers’ use of evidence through social influence and promote ease of use by helping decision makers understand the evidence provided and how to integrate it into professional processes (Langer, Tripney and Gough, 2016[2]; Kim et al., 2015[21]; Nutley, Walter and Davies, 2007[17]).

Interaction between decision makers can facilitate learning from each other and contribute to building professional standards of what fit-for-purpose evidence looks like. For example, decision makers may engage in regular meetings to discuss research published in scientific journals and its application in their professional practice (a “journal club”) (Harris et al., 2010[22]). Interaction among decision makers may also be targeted at developing a common understanding of how evidence should be used in specific decision-making situations. For instance, this can take the form of professionals engaging in shared learning of new or better ways of undertaking practice and of how evidence fits into these processes (“joint practice development”) (Hargreaves, 2011[23]; Sebba, Kent and Tregenza, 2012[24]).

Three approaches promise to strengthen the potential of interaction to support evidence use; they apply both to interaction among decision makers and to exchange and collaboration of decision makers with evidence providers. First, interaction should be structured around an explicit purpose. This includes organising structured events like workshops and providing structured time for interaction within the organisation, for example within network meetings. Second, interaction should favour more frequent, relatively low-threshold efforts over fewer, more ambitious efforts. Low-threshold exchanges create less potential for friction and may be timelier, offering an opportunity to contribute to current decision-making challenges. This includes informal regular exchanges between peers, for example integrated into regular work meetings, or mentoring or coaching relationships. Third, approaches should minimise opportunity costs by supporting exchanges. This includes providing the time and opportunity to engage with evidence providers and with peers, for example by organising cover for decision makers’ regular responsibilities to free their time (Shippee et al., 2013[25]; Langer, Tripney and Gough, 2016[2]; O’Mara-Eves et al., 2013[14]).

This area entails promoting the use of evidence as a principle of good decision making, and developing a common understanding of what constitutes fit-for-purpose evidence and of how evidence should be used for specific decision-making challenges. Recognising evidence use as a principle of good decision making underpins much of the demand for evidence. The aim is to stimulate behaviour and establish a positive attitude towards using evidence in daily practice. Raising awareness entails promoting the visibility of the issue and educating decision makers about the importance and benefits of using evidence. This may include events and awareness initiatives, providing consulting services, and raising awareness within regular work meetings. It also includes providing guidance and assistance on how to raise awareness, for example through information material made available to school leaders (Johnson and May, 2015[26]; Langer, Tripney and Gough, 2016[2]).

Promoting agreement on what constitutes fit-for-purpose evidence entails developing a common understanding of the requirements evidence must meet to be useful for particular decisions and challenges. Such agreement can increase the efficiency of using evidence and support efficient exchanges between evidence providers and decision makers. Formalising consensus on fit-for-purpose evidence can bolster commitment to standards and to expectations about when, where and how evidence should be used (Diamond et al., 2014[27]).

Developing a common understanding of how evidence should be used increases the efficiency of using evidence in decision making and promotes its systematic use. In policy making, this includes establishing how evidence should be used and which role it should play in preparing, implementing and evaluating measures. In schools, standards for evidence use may pertain to school development and curriculum development, as well as to using evidence in staff development decisions or for organisational measures (Paine Cronin and Sadan, 2015[28]).

References
[8] Abdullah, G. et al. (2014), “Measuring the effectiveness of mentoring as a knowledge translation intervention for implementing empirical evidence: A systematic review”, Worldviews on Evidence-Based Nursing, Vol. 11/5, pp. 284-300, http://dx.doi.org/10.1111/wvn.12060.

[15] Bunn, F. and K. Sworn (2011), “Strategies to promote the impact of systematic reviews on healthcare policy: A systematic review of the literature”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 7/4, pp. 403-428, http://dx.doi.org/10.1332/174426411X603434.

[1] Burns, T., F. Köster and M. Fuster (2016), Education Governance in Action: Lessons from Case Studies, Educational Research and Innovation, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264262829-en.

[9] Chambers, D. et al. (2011), “Maximizing the impact of systematic reviews in health care decision making: A systematic review of knowledge-translation resources”, The Milbank Quarterly, Vol. 89/1, pp. 131-156.

[11] Cordingley, P. (2016), “Knowledge and research use in local capacity building”, in Governing Education in a Complex World, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264255364-9-en.

[27] Diamond, I. et al. (2014), “Defining consensus: A systematic review recommends methodologic criteria for reporting of Delphi studies”, Journal of Clinical Epidemiology, Vol. 67/4, pp. 401-409, http://dx.doi.org/10.1016/j.jclinepi.2013.12.002.

[5] Fahey, G. and F. Köster (2019), “Means, ends and meaning in accountability for strategic education governance”, OECD Education Working Papers, No. 204, OECD Publishing, Paris, https://dx.doi.org/10.1787/1d516b5c-en.

[10] Gray, M. et al. (2012), “Implementing evidence-based practice”, Research on Social Work Practice, Vol. 23/2, pp. 157-166, http://dx.doi.org/10.1177/1049731512467072.

[23] Hargreaves, D. (2011), Leading a self-improving school system, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/325890/leading-a-self-improving-school-system.pdf.

[22] Harris, J. et al. (2010), “Are journal clubs effective in supporting evidence-based decision making? A systematic review, BEME Guide No. 16.”, Medical Teacher, Vol. 33/1, pp. 9-23, http://dx.doi.org/10.3109/0142159x.2011.530321.

[16] Harvey, G. et al. (2002), “Getting evidence into practice: The role and function of facilitation”, Journal of Advanced Nursing, Vol. 37/6, pp. 577-588, http://dx.doi.org/10.1046/j.1365-2648.2002.02126.x.

[3] Hess, C. and E. Ostrom (eds.) (2007), Understanding Knowledge as a Commons - From Theory to Practice, MIT Press, Cambridge, London, http://mitpress.mit.edu.

[7] Hyde, C. et al. (2000), Systematic Review of Effectiveness of Teaching Critical Appraisal, ICRF/NHS Centre for Statistics in Medicine, Institute of Health Sciences.

[26] Johnson, M. and C. May (2015), “Promoting professional behaviour change in healthcare: What interventions work, and why? A theory-led overview of systematic reviews”, BMJ Open, Vol. 5/9, p. e008592, http://dx.doi.org/10.1136/bmjopen-2015-008592.

[21] Kim, D. et al. (2015), “Social network targeting to maximise population behaviour change: A cluster randomised controlled trial”, The Lancet, Vol. 386/9989, pp. 145-153, http://dx.doi.org/10.1016/S0140-6736(15)60095-2.

[20] Kothari, A., S. Birch and C. Charles (2005), ““Interaction” and research utilisation in health policies and programs: does it work?”, Health Policy, Vol. 71/1, pp. 117-125, http://dx.doi.org/10.1016/j.healthpol.2004.03.010.

[13] Kreuter, M. and R. Wray (2003), “Tailored and targeted health communication: Strategies for enhancing information relevance”, American Journal of Health Behavior, Vol. 27/1, pp. 227-232, http://dx.doi.org/10.5993/ajhb.27.1.s3.6.

[2] Langer, L., J. Tripney and D. Gough (2016), The Science of Using Science - Researching the Use of Research Evidence in Decision-Making, EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London, http://eppi.ioe.ac.uk/cms/Default. (accessed on 15 January 2018).

[18] Durand, M. et al. (2014), “Do interventions designed to support shared decision-making reduce health inequalities? A systematic review and meta-analysis”, PLoS ONE, Vol. 9/4, p. e94670, http://dx.doi.org/10.1371/journal.pone.0094670.

[6] Michie, S., M. van Stralen and R. West (2011), “The behaviour change wheel: A new method for characterising and designing behaviour change interventions”, Implementation Science, Vol. 6/1, p. 42, http://dx.doi.org/10.1186/1748-5908-6-42.

[4] Newman, J. and B. Head (2017), “Wicked tendencies in policy problems: Rethinking the distinction between social and technical problems”, Policy and Society, http://dx.doi.org/10.1080/14494035.2017.1361635.

[12] Noar, S., C. Benac and M. Harris (2007), “Does tailoring matter? Meta-analytic review of tailored print health behavior change interventions.”, Psychological Bulletin, Vol. 133/4, pp. 673-693, http://dx.doi.org/10.1037/0033-2909.133.4.673.

[17] Nutley, S., I. Walter and H. Davies (2007), Using Evidence: How Research Informs Public Services, Policy Press, Bristol, UK.

[14] O’Mara-Eves, A. et al. (2013), “Community engagement to reduce inequalities in health: A systematic review, meta-analysis and economic analysis”, Public Health Research, Vol. 1/4, pp. 1-526, http://dx.doi.org/10.3310/phr01040.

[28] Paine Cronin, G. and M. Sadan (2015), “Use of evidence in policy making in South Africa: An exploratory study of attitudes of senior government officials”, African Evaluation Journal, Vol. 3/1, http://dx.doi.org/10.4102/aej.v3i1.145.

[19] Quinn, E. et al. (2014), “How can knowledge exchange portals assist in knowledge management for evidence-informed decision making in public health?”, BMC Public Health, Vol. 14/1, http://dx.doi.org/10.1186/1471-2458-14-443.

[24] Sebba, J., P. Kent and J. Tregenza (2012), What Does the Evidence Suggest are Effective Approaches? Schools and Academies Resource, http://www.badscience.net/.

[25] Shippee, N. et al. (2013), “Patient and service user engagement in research: A systematic review and synthesized framework”, Health Expectations, Vol. 18/5, pp. 1151-1166, http://dx.doi.org/10.1111/hex.12090.

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2020

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at http://www.oecd.org/termsandconditions.