2. Building effective connections to support the use of evidence
This chapter examines Lithuanian senior civil servants' and policy makers' ability to use evidence and evaluations. It notes that even though the COVID-19 pandemic has increased interest in scientific evidence, its use in decision making remains limited overall. The report suggests that Lithuania should invest in decision makers' skills to use, review and appreciate evidence. It also suggests that the publicity and communication of evaluations and evidence are insufficient to ensure their impact. Finally, the chapter analyses several key evidence-generating processes in the Lithuanian government, such as those for the evaluation of EU structural funds, for strategic planning and for performance audits by the Supreme Audit Institution. In addition, it suggests that the establishment of a government-wide framework for policy evaluation could create further systematic connections between evidence and decision making and provide an additional marketplace for evidence.
Supply of evidence is not a sufficient condition for its use: demand from primary intended users must also be present. Both research and practice indicate that despite the extensive production, communication and dissemination of policy analysis, use of evidence by decision makers remains limited for a variety of reasons. Specifically, evidence users – policy makers in particular – can face challenges related to a lack of competence to analyse and interpret evidence (Results for America, 2017[1]), meaning that they do not have the appropriate skills, knowledge, experience and abilities to use evaluation results. Other factors, such as environmental pressures to use evidence, can also influence the extent to which there is a demand for evidence for decision making. Governments may also put in place processes to promote the systematic use of evidence for decision making.
In Lithuania, demand for evidence and analysis remains an important challenge, in spite of recent heightened interest in scientific research in the context of the COVID-19 crisis. Yet demand is paramount for use and for effectively embedding evidence in policy-making processes. In this context, this report highlights the role of demand in promoting the use of evidence for decision making. First, it examines the role of the COVID-19 pandemic in accelerating demand for analysis and scientific evidence and suggests that efforts should be made to make this heightened demand more enduring. Second, the chapter looks at the role of skills in creating demand from decision makers for evaluations and evidence, and argues that any investment in analytical skills should go hand in hand with an investment in public sector managers' and policy makers' skills to actually use evidence. Finally, the chapter analyses the different ways in which the Lithuanian government could promote more systematic connections between supply and demand for evidence, either through increased publicity and communication of results, or through policy frameworks that embed the use of evidence in key decision-making processes.
Use is crucial for impact
Effective use of evidence and evaluations is key to embedding them in policy and decision-making processes. Without use of evidence, gaps will remain between what is known to be effective and decision making in practice. Moreover, as policy makers invest public funds in supplying evidence in the hope of improving policies and programmes and providing useful insights on public issues, its use is key. Conversely, underuse of evidence can jeopardise the evidence-informed decision-making agenda. When decision makers ignore the results of evaluations, for instance, future calls for evaluation may be undermined and evaluations or regulatory impact assessments may become check-the-box exercises.
The notion of use of evidence can have multiple meanings. Literature on evaluation and evidence identifies three main types of uses (Ledermann, 2012[2]):
Symbolic use (also known as persuasive use) occurs when the results of evaluations are taken up to justify or legitimise a pre-existing position, without changing it. Examples of this are when ministers use evaluations to justify their policy choices or when members of parliament use findings from an evaluation to push for a legislative proposal.
Conceptual use happens when evaluation results lead to an improved understanding or a change in the conception of the subject of evaluation. An example of this is the identification of a policy's collateral impacts or of reverse causation.
Instrumental use is when evaluation recommendations inform decision making and lead to actual change in the policy being evaluated. An example of this is the reallocation of funds after poor performance.
More importantly, users of evidence include not only decision makers, for whom conceptual and instrumental use are key, but also civil servants, experts and practitioners (local authorities, programme managers, health practitioners, etc.), who are looking for increased accountability, learning and better strategic decision making. Citizens and stakeholders are also users of evidence, as it allows them to hold policy makers accountable. Evidence can be used to improve regulations, inform resource allocations on the ground, monitor the implementation of policies, and more.
Achieving use remains a constant challenge across countries
Despite these many potential users, use of evidence remains a constant challenge for OECD countries, and often falls short of expectations. Indeed, even in countries where the supply of evidence is stable and supported by adequate capacities, an effective connection between supply and demand for evidence remains elusive (OECD, 2020[3]). For example, estimates from the United States show that under the two Obama administrations, only 1% of government funding was informed by evidence (Bridgeland and Orszag, 2013[4]). In the United Kingdom, there are also concerns about the use of evidence by government: a National Audit Office report on government evaluations found that there is little information on how the government has used the evaluation evidence that it had commissioned or produced (NAO, 2013[5]). A similar study in Australia found that although public servants seem to have good access to academic research, they are not using it systematically for policy advice (Newman, Cherney and Head, 2017[6]).
Furthermore, while many factors contribute to evaluation use, the specific barriers to evidence use vary depending on the context. Use of evaluation is, in many ways, “more of an art than a science” (Results for America, 2017[1]). Still, there are several ways in which governments can promote the use of evaluations, in particular by:
Increasing civil servants' and policy makers' demand for evaluations, specifically through competency development.
Supporting the uptake of evaluation results by granting access to evidence and communicating results strategically.
Institutionalising use, by embedding the use of evidence in processes and frameworks.
In Lithuania, the overall demand for evidence remains uneven. However, as in many other OECD countries, the COVID-19 pandemic has created a strong impetus for the use of scientific advice and data analysis.
The COVID-19 pandemic was a strong catalyst in increasing the demand for evidence by decision makers
The response to the COVID-19 crisis has provided a good example of how political and societal interest can strengthen the use of evidence. At the onset of the pandemic, a new system of health data sharing was quickly established in Lithuania, and data on hospital and intensive care unit bed occupancy rates was presented to the government on a daily basis. The State Patients Fund also created a questionnaire that each health care institution had to answer to track medical equipment and bed occupancy rates. A platform was created by the Office of the Government in order to update this health data, as well as other economic indicators, on a frequent basis. The data was then discussed at the level of the Office of the Government.
The wider scientific community was also successfully mobilised to contribute to an evidence-informed crisis management. For instance, the Lithuanian Research Council approved 29 research projects related to the pandemic. These research projects are conducted by universities and research institutes (see Table 2.1), at the Research Council’s request, based on an accelerated procurement scheme.
Strengthening a futures approach
Other positive signs of growing interest in the use of evidence and strategic foresight can be observed among the political leadership in Lithuania. In December 2020, the Parliament of Lithuania established the “Committee for the Future”, which aims to discuss long-term societal and governmental issues and trends (Parliament of Lithuania, 1994[7]). Migration and demographics, as well as changes in technology and innovation, are some of the topics that fall within the mandate of this new committee. The work of such a committee could be enhanced and its discussions substantiated by strategic foresight exercises conducted in the context of the development of strategic planning documents. The incorporation of evidence-based parliamentary discussions into the elaboration of strategic governmental plans would contribute to forging a consensus on a long-term national strategy. Meaningful deliberation of such long-term strategic state issues indeed requires the development of public sector capacities to gather, analyse and use evidence to feed into committee debates. These different initiatives reflect significant agility and capacity to adapt in the public sector – but most importantly they are good examples of strong demand for data and evidence arising from specific circumstances.
Beyond the COVID-19 case, demand for evidence remains hindered by institutional capacities and processes
Decision makers in Lithuania generally appear to be keen to use evidence for policy design, but do not always know where or how to find the data that they need. Other factors that limit the capacity for demand include strict procurement processes. For instance, members of Parliament can commission independent evaluations when a law proposes substantial changes to an existing regulation and at least 20% of members of Parliament support the initiative. This procedure, however, is rarely used due to strict procurement rules (Parliament of Lithuania, 1994[7]) and very low payment for these services. In May 2021, the Statute of Parliament was amended to provide greater flexibility in this procurement procedure (Parliament of Lithuania, 2021[8]). Still, challenges in this regard remain.
At other times, demand for evidence can be low when evaluations and analysis are perceived as formal obligations rather than key tools for better decision making. This can be the case for regulatory impact assessments (see Chapter 3 for more information on these challenges). Indeed, the civil service in Lithuania often seems to emphasise the tasks related to the preparation and implementation of laws rather than their analysis. The civil service law of 1999 (Parliament of Lithuania, 1999[9]) and the law on public administration of 1999 (Parliament of Lithuania, 1999[10]) mostly focus the civil service on skills related to policy implementation rather than to policy making. However, a full understanding of the policy-making cycle is necessary from a good governance perspective (see Box 2.1 below).
The policy-making cycle (Figure 2.1) is a general concept used to frame the policy-making process as a continuous and virtuous learning cycle, as presented in the figure below. In practice, however, the policy-making process is neither always linear nor cyclical, as it is also shaped by values, beliefs, political conflicts and priorities.
Often, though, as in most OECD countries, low demand for evidence can be related to decision makers’ lack of competency to analyse and interpret evidence (Results for America, 2017[1]), meaning that they do not have the appropriate skills, knowledge, experience and abilities to use evaluation results (Stevahn et al., 2005[12]; American Evaluation Association, 2015[13]; Newman, Fisher and Shaxson, 2012[14]).
A significant investment will be needed to improve decision makers’ skills to obtain, assess, use and apply evidence
Stimulating demand for evidence requires behaviour changes from decision makers, which are unlikely to be achieved exclusively through upskilling and training. This does not mean, however, that training initiatives cannot improve individual skills to use research.
The first step in improving skills for use, and thus promoting demand for evidence, is to understand what these skills entail. The OECD, together with the European Commission’s Joint Research Centre (JRC), has identified six clusters of skills that the public sector should aim to develop in order to employ evidence throughout the policy cycle. Box 2.2 expands on this skillset and provides definitions for each cluster of skills based on this joint OECD-JRC work.
This skillset is defined as a collective skillset for the improvement of public service in the future, not as an exhaustive list of skills that each public servant needs to master. It does not apply to a single scenario; rather, it is cross-cutting in character and can be applied on multiple occasions. It includes elements such as critical thinking, systems thinking and engaging with stakeholders.
Understanding EIPM – understanding of the policy cycle and knowledge of how evidence can be employed in each of its components. It has to be underpinned by familiarity with fundamental methods in research and statistics.
Obtaining Evidence – ability to recognise and measure the existing stock of evidence in the relevant policy area and to identify evidence gaps in order to commission high-quality studies.
Interrogating and Assessing Evidence – ability to assess the provenance, reliability and appropriateness of evidence by using systemic, holistic and critical thinking tools free of personal bias.
Using and Applying Evidence in Policy-Making – deep knowledge of the policy area and understanding of how different evidence, research and innovative approaches can be used to support policy design and implementation.
Engaging with Stakeholders in EIPM – strong engagement and communications skills. Ability to engage various groups of stakeholders in a discussion and to communicate policy messages effectively.
Evaluating the Success of EIPM – ability to use different evaluation approaches to inform and improve EIPM processes and policy cycle.
OECD country practices aimed at improving these skills reveal a wide range of approaches to skills development interventions. The OECD’s work on Building Capacity for Evidence-Informed Policy Making (OECD, 2020[3]) suggests that training for Senior Civil Service leadership should be aimed at increasing managers’ understanding of evidence-informed policy making and policy evaluation, enabling them to become champions for evidence use. In Canada, for example, the Executive Training in Research Application (EXTRA) programme provides support and development for leaders in using research. Intensive skills training programmes aimed at policy makers may be more focused on interrogating and assessing evidence and on using and applying it in policy making.
The Lithuanian government could thus consider organising training for Senior Civil Service leadership. This could be done in the context of the leadership programmes delivered by the Institute of Public Administration or by commissioning specialised external institutions. Such training can take the form of workshops, masterclasses or seminars. It could also build on the existing OECD work on skills for public sector innovation and civil service leadership (see Box 2.3 below). These skills are indispensable for the institutional and cultural transformation that is necessary to foster demand for evidence at the senior civil service and political levels.
Skills for public sector innovation
Basing policy decisions on evidence and public consultations is a relatively innovative and demanding approach necessitating a high degree of organisational transformation within the public sector. The OECD, in co-operation with NESTA (National Endowment for Science, Technology and the Arts), has developed a framework of skills for public sector innovation. The six core skills included in the framework are the following:
Iteration – policies and products developed experimentally and incrementally.
Data Literacy – ensuring that data is not used for ex post substantiation but to inform decisions.
User Centricity – ensuring that the public sector solves the needs of the public and users.
Curiosity – seeking out and trying new ideas and ways of working.
Storytelling – explaining ideas in a way that brings about change.
Insurgency – challenging the status quo and working with unusual partners.
The OECD research acknowledges that, apart from these skills, there are important capacities that are necessary for innovation-embracing behaviour in the public sector. The right mindset and attitude of the leadership, organisational culture and corporate systems constitute the institutional capacity to utilise the core skills for public sector innovation.
Capabilities for Public Sector Leadership
The OECD has also developed a framework of core capabilities for senior civil service leadership. Civil service leadership is the capacity of a civil servant to achieve the objectives of a government through collaboration with others. Leadership requires a set of four core capabilities:
Values-based Leadership: the presence of wicked and complex policy issues means that the consideration of conflicting values and interests of different stakeholder groups needs to guide decision making.
Open Inclusion: effective leaders seek to challenge their views through consultations with various stakeholders. They need to know how to make alternative voices feel comfortable sharing their concerns and suggestions.
Organisational Stewardship: senior civil servants need to reinforce a trust- and values-based culture and equip their workforce with the right skills, tools and working environments. Such managers can align the values of team members to motivate them to achieve a common goal.
Networked Collaboration: effective civil service leaders need to be adept at forging and managing collaboration between different governmental institutions and beyond the public sector. In such a working environment, relationships become the currency of public managers.
The framework recognises that leaders excelling in all four fields of competence might still be prevented from being effective leaders because of hindrances in the operational and institutional environment.
Source: OECD (2017[15]), Core Skills for Public Sector Innovation, https://www.oecd.org/media/oecdorg/satellitesites/opsi/contents/files/OECD_OPSI-core_skills_for_public_sector_innovation-201704.pdf (accessed on 24 May 2021); Gerson (2020[16]), Leadership for a High-Performing Civil Service: Towards Senior Civil Service Systems in OECD Countries, https://dx.doi.org/10.1787/ed8235c8-en.
Policy makers and stakeholders cannot use evidence and the results of evaluations if they do not know about them (Haynes et al., 2018[17]). The first step to promoting use is therefore that the results be made available to their intended users – simply put, that they be made public. But publicity alone is not enough, and active communication and dissemination strategies are needed to promote use.
Publicity creates incentives for use of evidence
There is a lack of systematic publication of analysis produced by public institutions
Making evidence public is an important element of ensuring its impact: if citizens are aware of evidence, it may build pressure on policy makers to use it (OECD, 2020[3]). Indeed, the publicity of policy advice as well as of evaluations is important in order to ensure that the public trusts the government not to “cherry-pick” the evidence produced by advisory bodies and evidence-generating systems (OECD, 2017[18]). Therefore, many countries have instituted policies mandating the publication of policy advice documents and facilitating easy public access to these materials.
In particular, OECD data shows that policy evaluation results are increasingly made public by countries (see Figure 2.3 below) (OECD, 2020[19]). Only one surveyed country reported that evaluation results are available only to selected officials on an ad hoc basis, while 16 OECD countries make evaluation findings and recommendations available to the general public by default – for example by publishing the reports on the commissioning institutions’ websites.
In Lithuania, evaluation results are generally made available across government, but not necessarily to the general public. Indeed, the publication and dissemination of evaluations remain fragmented. Most of the analytical work produced by the ministries is published on their respective websites. However, no centralised platform or government-wide searchable portal of analytical materials exists, nor are there clear government-wide guidelines on what has to be published and within what timeframe.
While evaluations of the EU structural funds have to be made public on the centralised esinvesticijos.lt platform, the new framework for spending reviews, as detailed in the recently adopted strategic governance methodology (Government of Lithuania, 2021[20]), foresees publication on the website of the Ministry of Finance only if the recommendations of a report are accepted for implementation. Similarly, the legislative framework mandates that ministries make all legislative projects public on a specific platform (the TAIS, Legal Acts Information System) (Article 5 of the Law on the Legislative Framework of 2012 (Parliament of Lithuania, 2012[21])). However, the regulatory impact assessment that must accompany every legal act is not itself always published; only its results have to be made public. Good practice would be to publish the full impact assessment, as is done at the European level or in countries such as Canada.
Therefore, a first step in ensuring that evidence is used in policy making in Lithuania would be to ensure facilitated and standardised access to evaluations, together with the systematic use of executive summaries drafted in plain language. This means making results available to their intended users and stakeholders, either through the individual websites of the commissioning institutions or through a centralised database, such as the one currently used for the evaluations of structural funds.
A centralised and searchable database for evaluations and policy analysis can facilitate use
STRATA has recently developed a library of evaluations conducted by public sector institutions. It centralises evaluations of EU structural funds, performance audits by the Supreme Audit Institution, and sectoral evaluations and studies.1 The visibility and accessibility of this library should be enhanced by its foreseen inclusion in the platform of the Martynas Mažvydas National Library.
While this is a highly laudable initiative, research suggests that ease of access is also an important factor in promoting the use of evidence (Haynes et al., 2018[17]). For this reason, STRATA’s evaluation repository should be transformed into an easy-to-use database, which could be hosted on the platform of the national library, making it easy to sort through the material by type of analytical material (e.g. evaluation of structural funds, regulatory impact assessment, ex post evaluation) and by the institution that conducted the study. Publication could also be made systematic and automated so that all studies and evaluations can be found there as soon as they are released. In doing so, the Lithuanian government could draw inspiration from the centralised portal for evaluations set up by the Directorate for Financial Management and the National Library of Norway (see Box 2.4 for more information on this database).
In Norway, the Directorate for Financial Management and the National Library of Norway maintain and manage a centralised evaluations portal (https://evalueringsportalen.no/). All studies and evaluations are made available on the portal as soon as they are published. Moreover, they are easily searchable and categorised: one can search by topic, commissioning institution, conducting institution, type of evaluation (ex post evaluation, socio-economic analysis, etc.) or the underlying method of the study (questionnaires, public datasets, literature review). The portal contains the studies conducted since 2005 by the government and agencies, as well as some selected earlier governmental studies. Finally, the portal also hosts various evaluation guidelines, evaluation agendas, and relevant professional and news publications.
Such a centralised platform helps to build knowledge and enable its reuse. Moreover, since it is easily searchable and updated by default, it increases the transparency of public sector analysis.
Source: OECD (2020[19]), Improving Governance with Policy Evaluation: Lessons from Country Experience, https://doi.org/10.1787/89b1577d-en.
Communication and dissemination are needed to increase awareness and impact
Some efforts are made to communicate evidence to a wider public
While a useful first step in promoting access to evidence, publicity is not enough. Indeed, research suggests that in isolation, publicity does not significantly improve the uptake of evaluations in policy making (Langer, Tripney and Gough, 2016[22]; Dobbins et al., 2009[23]; Haynes et al., 2018[17]). Rather, the presentation of evidence should be strategic and driven by the evaluation’s purpose and the information needs of intended users (Patton, 1978[24]). As such, evaluation results ought to be well synthesised and tailored to specific users in order to facilitate their use.
In Lithuania, a handful of institutions do practice more innovative and targeted communication strategies. For instance, some institutions have published reports tailored to a wider audience. The Bank of Lithuania periodically publishes a Working Paper Series, a Discussion Paper Series and an Occasional Paper Series, which are aimed at stakeholder, academic and policy communities, as well as a wider interested public. Another good practice is the Budget at a Glance report produced by the Ministry of Finance, which summarises the state budget’s composition and budgetary allocations in a concise manner that is understandable and interesting for a non-professional reader.
The Lithuanian Supreme Audit Institution also has a dedicated communication division that is charged with the development of communication channels and tools. Indeed, all recommendations from the performance audits produced since 2014 can be retrieved from the SAI’s open data portal (https://www.vkontrole.lt/atviri_duomenys_rekomendacijos.aspx). These recommendations are sorted thematically, and visuals inform readers of their implementation status. Hence, a user can find all the recommendations made in a field of interest without having to look up each specific report. Moreover, in some cases, conferences are organised to present audit results to stakeholders and the wider public.
However, few institutions have such dedicated communication units. Generally speaking, Lithuanian public institutions have limited experience in effectively communicating their analytical work. In addition, there are few established channels to disseminate the results and insights within the public sector and to the wider public.
Developing tailored communication and dissemination strategies
Thus, public institutions in Lithuania could develop tailored communication and dissemination strategies that increase access to clearly presented research findings, which is very important for use. These strategies can include the use of infographics, tailored syntheses of research evidence (for example in the form of executive summaries), dissemination of “information nuggets” through social media, seminars to present research findings, etc. (OECD, 2020[19]). In other OECD countries, such as Canada, departments are disseminating evaluation findings beyond departmental websites via platforms such as Twitter and LinkedIn.
Formal organisations, institutional mechanisms and processes set up a foundation for evidence-informed policy making that can withstand transitions between leaderships (Results for America, 2017[1]). Indeed, the use of evidence in policy and decision making is intimately linked to institutional structures and systems, insofar as they create fertile ground for supply and demand of evidence to meet. Such institutionalisation can be defined as the systematic process of embedding evidence-informed practices into more formal and systematic approaches (Gaarder and Briceño, 2010[25]). These mechanisms can be found either at the level of specific institutions, such as management response mechanisms, or within the wider policy cycle, such as through the incorporation of policy evaluation findings into the budget or regulatory cycle or discussions of evidence in strategic planning (OECD, 2020[19]). In Lithuania, some of the frameworks for these mechanisms are well established or currently undergoing significant reforms. Nevertheless, others still need to be improved in order to support more systematic supply and use of evidence for decision making.
The evaluations of structural funds are framed by requirements related to EU integration
Evidence from evaluations of structural funds is generated in a systematic manner in Lithuania as required by EU law
All programmes and projects funded by EU structural funds have to be evaluated. Indeed, Article 53 of the Common Provisions Regulation for 2014-2020 (European Parliament, 2013[26]) mandates that each operational programme be evaluated by functionally independent and preferably external evaluators (i.e. not from the public institution that manages the programme).
In Lithuania, the evaluations of structural funds are co-ordinated by the Ministry of Finance, which prepares the annual evaluation plans for the national operational programme, sets evaluation standards, organises capacity-building activities, provides methodological support and organises events to disseminate evaluation results. As detailed in the section on publicity above, these evaluations are systematically published on the centralised and user-friendly portal esinvesticijos.lt. The overall system for the evaluation of structural funds is represented in Figure 2.4.
The CPR for 2014-2020 mandates the creation of a “Monitoring Committee” made up of representatives of public institutions, non-governmental organisations and other economic and social partners. This committee approves the evaluation plan and the annual evaluation plans, discusses the evaluations and their recommendations and, hence, ensures the quality of the evaluations and the use of their results (Government of Lithuania, 2014[27]).
As a result, evidence from evaluations of structural funds is generated in a systematic manner in Lithuania, and there are established channels for the dissemination of the results of these evaluations, as well as for their use in policy making. A meta-evaluation conducted in 2013 and 2015 found that, of the recommendations made in evaluations conducted during the 2007-2013 financing period, 90% were accepted for implementation and 70% have been or will be implemented (Ministry of Finance, 2017[28]).
According to other studies carried out in recent years, these evaluation results are used not only to improve investment programmes, but also to develop strategic planning documents. For instance, results from the Evaluation of financing of the Lithuanian economic sectors: post 2020, which looked at public interventions in individual public policy areas aimed at ensuring sustainable growth and quality of life in the long term, were widely used and discussed in the preparation of the National Progress Plan of 2030 (PricewaterhouseCoopers and ESTEP Vilnius, 2019[29]; Government of Lithuania, 2020[30]).
The impact of these evaluations on government-wide capacities is unequal
While the evaluations of structural funds and their use reflect the implementation of best practices, they have not led to a significant creation of analytical capacity within the administration. The fact that external evaluators conduct the evaluations tends to reduce the possibility of positive spill-overs that would create internal government capacity and foster a broader evidence-based decision-making culture across ministries. As detailed in other chapters, the capacities for effective RIAs or ex post evaluations are still missing. These frameworks seem to benefit only marginally from the elaborate system of EU structural funds evaluations.
The framework for strategic planning and monitoring of strategies is undergoing significant reform
The strategic governance system in Lithuania is well institutionalised but remains complex
Strategic planning and monitoring create opportunities to generate and use evidence by identifying long-term trends, learning from previous planning cycles and identifying implementation gaps. The planning system in Lithuania is institutionalised through a network of strategic-planning units within each ministry as well as a governmental strategic committee. However, the strategic-planning system as it functioned until recently was very complex: about 250 strategic documents exist, while the strategic action plans include 1 800 monitoring indicators in total (Nakrosis, Vilpisauskas and Detlef, 2019[31]), and recent streamlining efforts have had only a partial effect to date.
The ongoing reform of the strategic governance system aims to strengthen the evidence-based nature of this framework and to encourage a more forward-looking perspective on policy making (Parliament of Lithuania, 2020[32]) by:
developing the main planning documents (the National Progress Plan and the State Progress Strategy) based on a foresight analysis conducted by STRATA. The Law on Strategic Governance stipulates that STRATA conducts “environment analysis”, or horizon scanning (Articles 13-2 and 15-2) (Parliament of Lithuania, 2020[32]).
linking the budget to these strategies, by including evidence on the performance of each ministry in the implementation of these plans in the budget.
The main long-term strategic document in the new system will be the State Progress Strategy for 2050, which is to be ratified by the Parliament, while the main medium-term planning document will be the National Progress Plan for 2030. The NPP will be implemented through 28 sectoral Development Programmes and additional (institutional) Strategic Activity Plans. In addition, the government programme remains the main political document, which expresses party consensus for the duration of the mandate and is supposed to provide impetus to ministerial actions. Figure 2.5 below illustrates this new system for strategic planning.
Thus, the implementation of this reform would reduce the number of strategic-planning documents from 290 to 100 (Office of Government, 2020[33]). Still, many types of strategic-planning documents would remain:
11 to 12 strategic documents: 2 strategies, 1 concept, 2 or 3 agendas and 4 plans of agendas.
up to 56 planning documents involving financial resource planning: the Government Programme and its implementation plan, and up to 28 National Development Programmes.
130 lower-level planning documents: 10 regional development plans, 60 municipal development plans, 60 municipal general plans (Office of Government, 2020[33]).
It seems that, while desirable, this reform will remain an incomplete agenda, with too many planning documents and strategic priorities still remaining.
The recent strategic planning documents adopted by the new government have taken note of the importance of prioritisation and included a limited number of over-arching principles and objectives that should be the focus of the whole government. Indeed:
The National Progress Plan 2021-2030 includes 10 key strategic objectives (Government of Lithuania, 2020[30]).
The government programme focuses on 12 priorities that are aligned with the NPP.
The implementation of the government’s programme includes 4 levels of priority projects (Government of Lithuania, 2021[34]):2 5 strategic reforms (civil service reform, the “schools of the millennium” development programme, the digital transformation of education “EDtech”, the development of innovation ecosystems in educational centres, and the development of an innovation agency and mission-based programmes for business and science innovation), 7 strategic projects in the Prime Minister’s portfolio, 11 strategic projects included in ministers’ portfolios but related to the 3 horizontal priorities of the government programme implementation plan (green course, digitalisation and inequality reduction), and other strategic projects implemented at the ministerial level.
Indeed, good international practice suggests that there should only be a limited number of objectives in order to focus and mobilise resources for their achievement. Conversely, too many objectives will scatter scarce resources and lead to unfocused delivery of policies and reforms (OECD, 2018[35]). Prioritisation is necessary to ensure that the strategy is realistic and can be implemented with the state’s existing resources. Another important challenge is that the duality between the Government Programme and other planning instruments remains. The Scottish government, for example, has identified the main long-term aspirations for the country through three main national-level outcomes, which are monitored through 81 outcome-level indicators. These outcomes are easy to understand and high-level, and serve as tools for citizens to assess the government’s efforts to improve the country’s well-being.
The National Performance Framework of Scotland
The National Performance Framework of Scotland proposes a Purpose for Scottish society to achieve. To help achieve this Purpose, the framework sets national outcomes that reflect the values and aspirations of the people of Scotland, are aligned with the United Nations Sustainable Development Goals and help to track progress in reducing inequality. These national outcomes include:
“We have a globally competitive, entrepreneurial, inclusive and sustainable economy”, with regard to the Scottish economy
“We respect, protect and fulfil human rights and live free from discrimination”, with regard to human rights
These National Outcomes are accompanied by a set of 81 outcome-level indicators, which are updated on a regular basis to inform the government on how the administration is performing against the Framework. A data dashboard where citizens can access data on these indicators is available on the Scottish Government Equality Evidence Finder website.
Source: Government of Scotland (n.d.[36]), National Performance Framework, https://nationalperformance.gov.scot/, (accessed 25 June 2021).
A stronger focus on a limited number of objectives and better alignment with the government programme could support the effectiveness of monitoring and its use
In order to further strengthen the impact of this reform, Lithuania should consider focusing the strategic planning documents on a limited number (i.e. around a dozen) of impact- and outcome-driven objectives, which seems to be the case with the latest version of the National Progress Plan, and fully aligning them with the political programme. Indeed, good international practice suggests that there should only be a limited number of objectives in order to focus and mobilise resources for their achievement and to facilitate monitoring. Conversely, too many objectives will scatter scarce resources and lead to unfocused delivery of policies and reforms (OECD, 2018[35]). Prioritisation is necessary to ensure that the strategy is realistic and can be implemented with the state’s existing resources. Similar reforms were carried out in Finland around 2015-16 following the OECD (2015) Finland and Estonia Public Governance Review (see Box 2.6 below). The fact that Lithuania has framed a strategic approach for the sustainability of its budgetary resources is an important step. This should establish not only clear linkages between objectives and priorities, but also a longer-term planning perspective and the application of cost-benefit analysis to all measures and projects. The challenge will be to achieve the thematic concentration of resources needed to attain the defined objectives, as well as to align the strategic governance of the government programmes with resource prioritisation.
In 2011-2015, Finland had only one horizontal strategic document: the four-year government programme. Nevertheless, this document included around 900 measures. The measures included in the programme were subsequently transformed into 140 projects in the government programme implementation plan (for the years 2011-2015). To add to the complexity of the government programme, line ministries would develop their own planning documents, which were not always aligned with the government programme. Most importantly, the medium-term budgetary plans managed by the Ministry of Finance were disconnected from the government programme and often influenced line ministries’ planning documents more than the government programme’s implementation plan.
To alleviate the complexity of the government programme, Finland identified a handful of actionable policy objectives starting in 2011. For 2011-2015, these priorities were: i) prevention of poverty, inequality and social exclusion; ii) consolidation of public finances; and iii) enhancement of sustainable economic growth, employment and competitiveness.
The current government has identified five key horizontal policy objectives: i) employment and competitiveness; ii) skills and education; iii) well-being and health; iv) the bioeconomy and clean solutions; and v) digitalisation, experimentation and deregulation.
Furthermore, the Prime Minister’s Office (PMO) is in charge of monitoring the government programme. Specifically, the strategy unit in the PMO monitors the implementation of the five key horizontal policy objectives and of the wide structural reform of social and health care services that are part of Finland’s government-wide strategy. The key policy areas are monitored weekly at the level of the Centre of Government in government strategy sessions reserved for situation awareness and analysis based on evidence and foresight. Milestones for each policy area and project are clearly defined, and indicators for each strategy target are updated two to four times a year.
Source: OECD (2015[37]), OECD Public Governance Reviews: Estonia and Finland: Fostering Strategic Capacity across Governments and Digital Services across Borders, https://doi.org/10.1787/9789264229334-en; Government of Finland (n.d.[38]), Implementation of Government Programme, https://valtioneuvosto.fi/hallitusohjelman-toteutus/karkihankkeiden-toimintasuunnitelma (accessed on 25 June 2021).
Longer-term strategic planning documents of an aspirational nature can remain, inspired by foresight and seeking to chart a longer-term vision of the future. These will help to shape current choices and strategies but should not be subject to yearly monitoring. Periodic revisions of these strategic longer-term documents could be accompanied by structural analysis of progress and remaining challenges in the Lithuanian economy and society.
Spending reviews are still conducted seldom and in an ad hoc manner, but steps are being taken to formalise them
Spending reviews are collaborative processes aimed at identifying and adopting policy options by analysing the government’s existing expenditure within defined areas, and linking these options to the budget process (OECD, 2017[39]) (OECD, Forthcoming[40]). The purposes of a spending review are to:
Enable the government to manage the aggregate level of expenditure
Align expenditure according to the priorities of the government
The use of spending reviews has increased considerably among OECD countries since the aftermath of the global financial crisis. Indeed, spending reviews have proved to be an important tool for governments, not only to control the level of total expenditure by making space for more resources, but also to align spending allocations with government priorities and to improve effectiveness of policies and programmes.
OECD data shows that in 2020, a large majority of OECD countries report conducting spending reviews, either on an annual basis (20 countries) or periodically (11 countries). The total number of countries currently using spending reviews (31) has thus almost doubled compared to 2011, when only 16 countries were conducting this exercise (OECD, 2017[39]). As such, spending reviews are an important source of evidence to inform government activities, as they build an understanding of what works with regard to public spending.
While the Lithuanian government has conducted spending reviews in the past, it has not done so on a systematic basis. The current Strategic Governance Reform led by the Office of the Government thus includes the evaluation of current expenditures as part of the new strategic governance methodology. On 28 April 2021, the Lithuanian government approved the methodology, which details both how the investment spending included in the National Progress Plan should be monitored and evaluated, and how spending reviews should be applied for the evaluation of current expenditures (Government of Lithuania, 2021[20]). The Central Project Management Agency (CPMA) will thus be mandated to conduct spending reviews in co-operation with the Ministry of Finance and the Government Office (BGI Consulting, 2019[41]).
Every year, the Ministry of Finance and the Government Office will agree on the spending review topics for the upcoming year. These spending reviews should serve as a basis for discussions of the national budget. However, they cannot be done without the active participation and evaluative work of the ministries in charge of the areas under review, and they require the use of evaluations. This implies that strengthening analytical capacities in the ministries will be a prerequisite for this reform to bear fruit, as the experience of other countries shows that spending review attempts can fail to produce results unless they are supported by capacities in line ministries.
Through this reform, Lithuania aims to incorporate the results of evaluations into the budgetary cycle. Incorporation of evaluation findings into the budgetary cycle is one of the most commonly used mechanisms for promoting the use of evidence. In fact, OECD data shows that half of surveyed countries report incorporating evaluation evidence into the budgetary cycle (OECD, 2020[19]).
The SAI conducts performance audits, which achieve a certain degree of impact
In many OECD countries, Supreme Audit Institutions have responsibilities for conducting evaluations or performance audits (OECD, 2020[19]). The International Organisation of Supreme Audit Institutions (INTOSAI) has published guidelines for policy evaluations in which it reiterates that SAIs’ independence, methodological capacities and solid understanding of public policies give them an advantage in conducting policy evaluations (INTOSAI, 2016[42]). Nevertheless, Supreme Audit Institutions are external to government, and thus their evaluations cannot replace the internal analysis conducted within government. The two types of evaluation should be complementary: while external evaluations provide greater transparency and accountability, internal evaluations promote greater use of evidence (OECD, 2020[19]).
The Supreme Audit Institution of Lithuania does conduct performance audits that provide the Lithuanian government with robust evidence regarding the performance of specific public policies. This process is well established, as the SAI drafts its annual programme of audits based on its internal risk assessment system and the need to evaluate the efficiency and effectiveness of specific policies and programmes. The Parliament can also mandate the SAI to conduct an audit during a parliamentary plenary session.
Performance audits conducted by the Supreme Audit Institution are systematically discussed in the Audit Committee of Parliament, as well as in the relevant parliamentary committees. The results are rarely contested and seldom result in a public debate (Nakrosis, Vilpisauskas and Detlef, 2019[31]). Thus, the process through which the SAI conducts performance audits has promoted systematic linkages between supply and use of evidence – as shown by the fact that 82% of the recommendations made by the SAI in this context between 2010 and 2019 were implemented (Supreme Audit Institution, 2020[43]).
Lithuania should consider developing a government-wide framework for policy evaluation that could promote its use
Policy evaluation can be defined as a structured and objective assessment of a projected, planned, ongoing or completed policy, programme, regulation or reform initiative, including its design, implementation and results. Its aim is to determine the relevance and fulfilment of objectives, efficiency, effectiveness, impact and sustainability as well as the worth or significance of a policy intervention. The term “evaluation” can cover a range of practices, including – but not limited to – regulatory assessment, and can be embedded into various policy planning and policy-making processes. On the other hand, monitoring corresponds to a routinised process of evidence gathering and reporting to ensure that resources are adequately spent, outputs are successfully delivered, and milestones and targets are met (OECD, 2020[19]).
A policy or legal framework for evaluation across government helps to promote systematic linkages between supply and use of evaluations in decision making
Embedding policy evaluation in evidence-informed decision making requires a legal or policy framework, insofar as such a framework provides a key legal basis for undertaking evaluations as well as guidance on when and how to carry them out. This can create systematic linkages between evaluations and key decision-making processes. Legal and policy frameworks may also formally determine the institutional actors, their mandates and the resources needed to oversee, carry out and use evaluations (OECD, 2020[19]).
Indeed, the institutionalisation of evaluation practices helps to ensure that siloed evaluation efforts are combined into a homogenous system of evaluations that enables the prioritisation and standardisation of methodologies, practices and quality (Gaarder and Briceño, 2010[25]). Specifically, adopting clear government-wide legal and policy frameworks for policy evaluation can help to:
clarify mandates and responsibilities regarding the promotion of policy evaluations, as well as their quality and use;
provide high-level guidance and clarity for institutions by outlining overarching best practices, goals and methods for policy evaluation.
Several paths exist for the legal institutionalisation of evaluation practices. As shown in the OECD (2018) survey, the need for policy evaluation is recognised at the highest level, with a large majority of countries (23) having requirements for evaluation, either in their primary or secondary legislation, or even in their constitution (OECD, 2020[19]). Moreover, about half of surveyed countries (17 OECD countries) have developed a policy framework for evaluation, i.e. a document or set of documents that provides strategic direction, guiding principles and courses of action to the government for a specific sector or thematic area.
In Canada, for example, the Policy on Results, implemented under the aegis of the Treasury Board (see Box 2.7 below), provides cross-government guidance on when and how to conduct policy evaluations, ensures the quality of evaluations and supports their use through systematic linkages with spending decisions.
In July 2016, the Government of Canada launched the Policy on Results, which complemented Canada’s Financial Administration Act requiring the evaluation of grant and contribution programmes every five years. The Policy on Results is managed by the Treasury Board of Canada and aims to clarify the objectives of government-funded programmes and the use of associated resources through evaluations. To this end, the Policy establishes a five-year evaluation schedule detailing the mandatory and discretionary evaluations to be conducted, as well as the departments responsible for leading these evaluations.
The Treasury Board also promotes the use of the results of these evaluations in subsequent policy decisions. For example, evaluation results have to be submitted by department heads when they submit a proposal to the Treasury Board for new spending.
Under the Policy on Results, each government department is mandated to set up an evaluation unit as well as a departmental results framework. The competencies of the Treasury Board of Canada as a steering body for policy evaluations include, among others:
Ability to require departments to undertake specific evaluations and participate in centrally-led evaluations.
Approval of line ministries’ departmental results frameworks and of any changes to their organisations’ core responsibilities.
The quality of the analysis is ensured through investment in capacity and skills, peer reviews of evaluations (both internal and external), and guidelines. Moreover, steering groups for both evaluations and performance measurement meet several times a year to discuss the challenges in conducting the analysis and using the resulting evidence. This system significantly contributed to promoting an understanding of the importance of evidence-informed decision making across government.
Source: OECD (2020[19]), Improving Governance with Policy Evaluation: Lessons from Country Experiences, https://doi.org/10.1787/89b1577d-en.
The Lithuanian government should consider adopting a clear government-wide framework for policy evaluation
Until recently, there had not been any effort to co-ordinate policy evaluation across government in Lithuania. The methodologies for strategic governance (Government of Lithuania, 2021[20]) and ex post evaluation (Government of Lithuania, 2021[44]) establish cross-government frameworks for the evaluation of planning documents and for regulatory assessments (see Chapter 3), respectively. The former mandates the evaluation of the National Progress Plan, which is the main 10-year planning document in Lithuania, as well as of other annual and longer-term planning documents. The framework includes a 10-year evaluation plan for the National Progress Plan and for individual Development Programmes. However, this framework has only recently been adopted, and its effectiveness cannot be assessed at the time of writing.
Generally, Lithuania could consider adopting a whole-of-government framework for policy evaluation. Such a framework could not only specify the role of government institutions with regard to the promotion of policy evaluation, but also provide methodological guidance for its implementation and include a process of long-term and annual plans. In particular, the framework could give the Office of the Government the mandate to promote and co-ordinate policy evaluation across government. STRATA could also promote the quality of policy evaluations by developing guidelines for policy evaluation and supporting ministries in the implementation of these guidelines, for instance by conducting trainings.
Give the wider public access to evaluations by:
creating a one-stop-shop searchable web portal for government evaluations.
This website could also build upon the model of the portal used to publish the evaluations of EU Structural Funds, or the existing virtual library of STRATA.
New evaluations or analyses should be automatically uploaded to this portal as they are released.
The portal should make it possible to search for a study by type (RIA, ex post evaluation, spending review, strategic foresight), owning institution, topic and keywords.
ensuring that evaluations are systematically accompanied by executive summaries and any relevant metadata.
Improve the communication of evidence and evaluation through:
Create systematic feedback loops for evidence, by:
Holding systematic discussions at the highest political level, as well as in parliament, on the results of strategic foresight, impact assessments, evaluations and spending reviews produced by government institutions.
Organising training and discussion sessions for higher-level civil servants and political decision makers on how to use academic research, and possibly on leadership and innovation skills.
Strengthen the role of strategic planning in the evidence-informed decision making system by:
Shaping a more forward-looking vision and building capacity for resilience to future shocks. Lithuania’s long-term strategic documents should be based on evidence-informed strategic foresight exercises and deliberation in parliament. These documents should also be aspirational in nature, steering the whole-of-government effort without imposing a significant monitoring burden or rigidity.
Further reducing the number of strategic and planning documents, by ensuring that the implementation of the new strategic governance framework results in a genuine reduction in complexity.
Identifying a small number of key objectives to monitor across the whole of government. Actionable policy priorities would help the government monitor progress better and ensure that priorities are not cherry-picked by the administration.
Develop a government-wide policy framework for evaluation which:
Clarifies the roles of government institutions in promoting policy evaluation.
Gives the Office of the Government the role of promoting and co-ordinating policy evaluation across government.
Gives STRATA the mandate, and builds its capacity, to promote the quality of policy evaluations by developing guidelines for policy evaluation and supporting ministries in implementing them.
References
[13] American Evaluation Association (2015), Core Evaluator Competencies, http://www.eval.org (accessed on 28 May 2021).
[41] BGI Consulting (2019), Viešųjų intervencijų ir išlaidų vertinimo, stebėsenos rodiklių metodikų bei įrankių sukūrimo paslaugos (Public Interventions, Spending Review and Monitoring Indicators Methodologies and Tools Development Services).
[4] Bridgeland, J. and P. Orszag (2013), Can Government Play Moneyball?, http://www.theatlantic.com/magazine/archive/2013/07/can-government-play-moneyball/309389/ (accessed on 11 May 2021).
[23] Dobbins, M. et al. (2009), “A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies”, Implementation Science, Vol. 4/1, p. 61, http://dx.doi.org/10.1186/1748-5908-4-61.
[26] European Parliament (2013), Regulation (EU) No 1303/2013 of the European Parliament and of the Council of 17 December 2013, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32013R1303&from=en (accessed on 11 May 2021).
[25] Gaarder, M. and B. Briceño (2010), “Institutionalisation of government evaluation: balancing trade-offs”, Journal of Development Effectiveness, Vol. 2/3, http://dx.doi.org/10.1080/19439342.2010.505027.
[16] Gerson, D. (2020), “Leadership for a high performing civil service: Towards senior civil service systems in OECD countries”, OECD Working Papers on Public Governance, No. 40, OECD Publishing, Paris, https://dx.doi.org/10.1787/ed8235c8-en.
[38] Government of Finland (n.d.), Implementation of Government Programme, https://valtioneuvosto.fi/hallitusohjelman-toteutus/karkihankkeiden-toimintasuunnitelma (accessed on 25 June 2021).
[34] Government of Lithuania (2021), Aštuonioliktosios Lietuvos Respublikos Vyriausybės programos nuostatų įgyvendinimo planas Nr. 21-20649 (The Implementation Plan of the 18th Government of the Republic of Lithuania), https://lrv.lt/uploads/main/documents/files/VPN%C4%AEP%20projektas.pdf (accessed on 27 May 2021).
[44] Government of Lithuania (2021), Ex-post Vertinimo metodologija (Ex-post Evaluation Methodology).
[20] Government of Lithuania (2021), Strateginio Valdymo Metodika (Strategic Governance Methodology), https://e-seimas.lrs.lt/portal/legalAct/lt/TAD/5a6f68c4a8e511eb98ccba226c8a14d7?positionInSearchResults=0&searchModelUUID=ec7e06f0-8ee4-4478-a94c-1510659cba79 (accessed on 12 May 2021).
[30] Government of Lithuania (2020), 2021–2030 Metų Nacionalinis Pažangos Planas (National Progress Plan 2021-2030), https://e-seimas.lrs.lt/portal/legalAct/lt/TAP/1f5eadb1f27711eab72ddb4a109da1b5?positionInSearchResults=0&searchModelUUID=0b263687-74a4-48c4-b8a8-8e441cde702e (accessed on 11 May 2021).
[27] Government of Lithuania (2014), Resolution on Confirmation of Administrative Rules EU funds investments Operational Programmes.
[36] Government of Scotland (n.d.), National Performance Framework, https://nationalperformance.gov.scot/ (accessed on 25 June 2021).
[17] Haynes, A. et al. (2018), “What can we learn from interventions that aim to increase policy-makers’ capacity to use research? A realist scoping review”, Health Research Policy and Systems, Vol. 16/1, p. 31, http://dx.doi.org/10.1186/s12961-018-0277-1.
[42] INTOSAI (2016), Guidelines on the Evaluation of Public Policies, https://www.intosaicommunity.net/wgeppp/wp-content/uploads/2019/08/INTOSAI-GOV-9400_ENG.pdf (accessed on 26 May 2021).
[22] Langer, L., J. Tripney and D. Gough (2016), The Science of Using Science: Researching the Use of Research Evidence in Decision-Making.
[2] Ledermann, S. (2012), “Exploring the Necessary Conditions for Evaluation Use in Program Change”, American Journal of Evaluation, Vol. 33/2, pp. 159-178, http://dx.doi.org/10.1177/1098214011411573.
[28] Ministry of Finance (2017), Kiekybinis ir kokybinis 2007–2013 m. veiksmų programų pasiektų tikslų ir uždavinių vertinimas: integruota ataskaita (Qualitative and Quantitative Evaluation of the Achievement of 2007-2013 Operational Programmes’ Objectives: Integrated Report), Ministry of Finance, Vilnius.
[31] Nakrosis, V., R. Vilpisauskas and J. Detlef (2019), Sustainable Governance Indicators 2019: Lithuania Report.
[5] NAO (2013), Evaluation in Government, http://www.nao.org.uk/report/evaluation-government/ (accessed on 11 May 2021).
[6] Newman, J., A. Cherney and B. Head (2017), “Policy Capacity and Evidence-Based Policy in the Public Service”, Public Management Review, Vol. 19/2, pp. 157-174, http://dx.doi.org/10.1080/14719037.2016.1148191.
[14] Newman, K., C. Fisher and L. Shaxson (2012), “Stimulating Demand for Research Evidence: What Role for Capacity-building?”, IDS Bulletin, Vol. 43/5, pp. 17-24, http://dx.doi.org/10.1111/j.1759-5436.2012.00358.x.
[3] OECD (2020), Building Capacity for Evidence-Informed Policy-Making: Lessons from Country Experiences, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/86331250-en.
[19] OECD (2020), Improving Governance with Policy Evaluation: Lessons from Country Experiences, OECD Publishing, Paris, https://doi.org/10.1787/89b1577d-en.
[35] OECD (2018), OECD Public Governance Reviews: Paraguay: Pursuing National Development through Integrated Public Governance, OECD Public Governance Reviews, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264301856-en (accessed on 30 January 2020).
[15] OECD (2017), Core Skills for Public Sector Innovation, https://www.oecd.org/media/oecdorg/satellitesites/opsi/contents/files/OECD_OPSI-core_skills_for_public_sector_innovation-201704.pdf (accessed on 24 May 2021).
[18] OECD (2017), Policy Advisory Systems: Supporting Good Governance and Sound Public Decision Making, OECD Public Governance Reviews, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264283664-en.
[39] OECD (2017), “Spending review”, in Government at a Glance 2017, OECD Publishing, Paris, https://dx.doi.org/10.1787/gov_glance-2017-42-en.
[11] OECD (2016), The Governance of Inclusive Growth, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264257993-en.
[37] OECD (2015), OECD Public Governance Reviews: Estonia and Finland: Fostering Strategic Capacity across Governments and Digital Services across Borders, OECD Public Governance Reviews, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264229334-en.
[40] OECD (Forthcoming), OECD Best practices on Spending Reviews.
[33] Office of the Government (2020), Keičiama Lietuvos Respublikos Strateginio Valdymo Sistema (Changes in the Strategic Governance System of the Republic of Lithuania).
[8] Parliament of Lithuania (2021), Dėl Lietuvos Respublikos Seimo statuto Nr. I-399 145 straipsnio pakeitimo (Resolution on the amendment of the article 145 of the Statute of Parliament nr. I-399), https://e-seimas.lrs.lt/portal/legalAct/lt/TAK/c262e7005fab11eb9954cfa9b9131808?positionInSearchResults=5&searchModelUUID=b85032a5-89fb-416b-a579-466488a48b28 (accessed on 26 May 2021).
[32] Parliament of Lithuania (2020), Strateginio valdymo įstatymas Nr. XIII-3096 (Law on Strategic Governance).
[21] Parliament of Lithuania (2012), Teisėkūros pagrindų įstatymas XIII-3243 (Law on the Legislative Framework of the Republic of Lithuania) (last amended 26 June 2020).
[9] Parliament of Lithuania (1999), Lietuvos Respublikos valstybės tarnybos įstatymas XIII-1370 (Law on Civil Service of the Republic of Lithuania) (last amended 10 November 2020).
[10] Parliament of Lithuania (1999), Lietuvos Respublikos viešojo administravimo įstatymas VIII-1234 (Law on the Public Administration of the Republic of Lithuania) (last amended 11 June 2020).
[7] Parliament of Lithuania (1994), Lietuvos Respublikos Seimo Statutas (The Statute of Seimas of the Republic of Lithuania).
[24] Patton, M. (1978), Utilization-Focused Evaluation, Sage Publications, Beverly Hills.
[29] PricewaterhouseCoopers and ESTEP Vilnius (2019), Lietuvos Ūkio Sektorių Finansavimo po 2020 Metų Vertinimas (Financing of Lithuanian Economic Sectors after 2020).
[1] Results for America (2017), 100+ Government Mechanisms to Advance the Use of Data and Evidence in Policymaking: A Landscape Review.
[12] Stevahn, L. et al. (2005), “Establishing Essential Competencies for Program Evaluators”, American Journal of Evaluation, http://dx.doi.org/10.1177/1098214004273180.
[43] Supreme Audit Institution (2020), 2019 veiklos ataskaita (2019 Activity Report).
Notes
1. It is accessible on the website of STRATA: https://strata.gov.lt/lt/poveikio-vertinimas/atliktu-poveikio-vertinimu-katalogas
2. For information on the government’s strategic agenda, see https://lrv.lt/lt/aktuali-informacija/xviii-vyriausybe/ministro-pirmininko-strateginiu-darbu-projektu-portfelis