6. Emerging technology governance: Towards an anticipatory framework

Emerging technologies have a central role to play in our collective future. They will help reshape the infrastructure and capacities of our societies and help drive our economies and our behaviour in new ways. While problems like climate change and global health disparities cannot be solved by technology alone, technology policy can be a pivotal factor in the responsiveness and resilience of our sociotechnical systems in the face of crisis.

In addition to the great promise of emerging technologies for green transitions and other crucial societal objectives, rapid technological change can carry negative consequences and risks for individuals, societies, and the environment. Relevant threats include social disruption, various kinds of inequity, and violations of privacy and human rights. For example, facial recognition and spyware are becoming tools of mass surveillance (Ryan-Mosley, 2022[1]), social media is a known vector for the active propagation of misinformation (Matasick, Alfonsi and Bellantoni, 2020[2]), and reported mandatory involvement in genomics research violates human rights standards (Wee, 2021[3]).

Emerging technology also carries major implications for distributive justice, geopolitics, and security. While COVID-19 vaccines have been critical in alleviating illness in high-income countries, they have reached low- and middle-income countries unevenly. As previous chapters have discussed, calls for technological independence – at best, “technological sovereignty” (Crespi et al., 2021[4]) and at worst, new forms of techno-nationalism (Capri, 2019[5]) – have strained international science and technology co-operation, in the same vein as what might be called a “security turn” in innovation policy (see Chapter 1). The globalisation of emerging technologies has also revealed supply chain vulnerabilities, with implications for economic resilience.

Given the double-edged nature of emerging technology, good technology governance might encourage the greatest societal benefit from technology and help prevent social, economic, and political harms. Technology governance can be defined as “the process of exercising political, economic and administrative authority in the development, diffusion and operation of technology in societies” (OECD, 2018[6]). In the context of emerging technologies, the concept of governance has evolved in response to high uncertainty (Folke et al., 2005[7]), risk (Baldwin and Woodard, 2009[8]), complexity (Hasselman, 2016[9]) and the need for co-operation (Sambuli, 2021[10]). From setting rules on the integrity of science to establishing norms for biosecurity and responsible neurotechnology (OECD, 2019[11]), technology governance provides norms and standards for both the bottom-up research that drives discovery, and the application and use of technologies in society.

Perhaps for these reasons, technology governance has attracted increasing attention at a high political level. In recent years, several international fora have focused on the topic of technology governance, including France’s “Technology for Good” initiative (Tech For Good Summit, 2020[12]), the United Kingdom’s “Future Tech Forum” under its 2021 Group of Seven Presidency (HM Government, 2022[13]) and the initiative on “Democracy-Affirming Technologies” launched at President Biden’s Summit on Democracy (The White House, 2021[14]). At the OECD, the Global Forum on Technology was initiated in 2022 to foster multi-stakeholder collaboration on digital and emerging technology policy (see Box 3.7), and the 2021 Recommendation of the Council for Agile Regulatory Governance to Harness Innovation sets norms for rethinking governance and regulatory policy to better harness the societal impacts of innovation (OECD, 2021[15]). Furthermore, the United States and the United Kingdom recently announced an initiative on “privacy-enhancing” technology (The White House, 2021[16]). In the same vein, the need for “human-centric” artificial intelligence (AI) has become a refrain across the public and private sectors and the subject of an influential soft-law instrument at the OECD (OECD, 2019[17]).

These nascent efforts at international technology governance often frame the challenge as one of better regulation. Although it is no doubt one component of the technology governance challenge, this framing arguably does not address a general and recurring problem across critical and emerging technologies such as AI, robotics and synthetic biology: as their development advances, their impacts on society become more profound, and their effects more entrenched (OECD, 2018[18]). It follows that shaping them, without undue restriction, during the innovation process could carry great societal utility.

Efforts to exercise political, economic, or administrative authority during the innovation process might be called “upstream” or “anticipatory technology governance”. Such an approach to governance shifts the locus from exclusively managing the risks of technologies to engaging in the innovation process itself. It aims to anticipate concerns early on, address them through open and inclusive processes, and align the innovation trajectory with societal goals (OECD, 2018[18]). Of course, a balance must be struck between preserving space for serendipitous technology development and shaping technology trajectories through upstream governance.

Actors in the field of international technology governance invoke the need to promote “shared values” – which in the context of these initiatives tend to include the values of democracy, human rights, sustainability, openness, responsibility, security and resilience (e.g. Council of Europe, 2019[19]; US State Department, 2020[20]). To the extent that it can help embed values within the innovation process itself, the anticipatory approach to technology governance might be better positioned than post-hoc regulatory approaches to enact a stated goal of values-affirming technology.

This chapter does not aim to identify the core substantive values that should guide technological development, or to reconcile different positions on them. Instead, it analyses the following question: given that the democratic community is increasingly asserting that values should be embedded in and around technology (e.g. non-discrimination in AI algorithms), how should this be accomplished? Governments are increasingly recognising and aiming to address this challenge. These initiatives rest on an important premise: technology should no longer be viewed as an autonomous agent, but as a system which, through governance, can better serve societal goals and values.

This chapter documents and analyses a set of design criteria and tools that could guide this approach, with the aim of elaborating an anticipatory framework for emerging technology governance. It is not intended to provide an exhaustive review of design criteria and tools. Rather, it provides a framework for further analytical and normative work, suggesting ideas for the design of good technology governance systems. Figure 6.1 shows this framework, linking values, design criteria, and mechanisms and tools. The chapter explores how actors can implement these design criteria for governance, using policy tools.

In areas other than science, technology, and innovation (STI), anticipatory governance has emerged as a key challenge for governments as they try to move from a reactive stance towards addressing the complexities and uncertainties of the economic and political present (OECD, 2022[21]). Likewise, actors in the STI system have been laying the groundwork for anticipatory technology governance for some time (Guston, 2013[22]), in part under the banner of responsible research and innovation (von Schomberg, 2013[23]). An important aim of this upstream approach is to align research and development (R&D) of cutting-edge technology with key societal goals, whether related to energy transitions, health systems or mobility. To do so, anticipatory governance aims to identify possible stakeholder concerns and values, address them through open and inclusive processes, and embed shared values in the development of new technologies.

The responsible research and innovation approach argues that embedding responsibility and accountability in the activities of researchers, firms and other actors can help orient new technologies towards meeting grand challenges, rather than just decreasing the likelihood of undesirable effects of technologies (Shelley-Egan et al., 2017[24]; Owen, von Schomberg and Macnaghten, 2021[25]). This is consistent with the turn towards mission-oriented innovation policy (Larrue, 2021[26]) and is the cornerstone of the Recommendation of the OECD Council on Responsible Innovation in Neurotechnology (OECD, 2019[11]).

Actors in both the public and private sectors are starting to take a more proactive approach to technology governance, engaging in activities like anticipatory agenda-setting, test beds, and value-based design and standardisation as a means of addressing societal goals upstream (OECD, 2018[6]). National actors are beginning to promote a holistic view of the challenges and opportunities inherent to the governance of emerging technologies. They are developing frameworks to address recurring concerns and approaches, thereby facilitating learning across technology areas. The National Academy of Medicine in the United States, for instance, recently published a framework for the governance of emerging medical technologies (Mathews, Balatbat and Dzau, 2022[27]). In addition, regulatory communities have already convened at the OECD with the objective of reforming regulatory governance to better harness innovation (OECD, 2021[15]).

Taken as a whole, recent activities in emerging technology governance can be grouped under a policy framework comprising values, design criteria and tools for putting shared values into practice (see Figure 6.1). These components lay the foundation for discussions on emerging technology governance. Each of these elements is outlined below.

Key values orient governance systems, and therefore ground the model. They are not always explicit, and the tools described below may be necessary to surface them. This element answers the question of what is worth ensuring, enabling, and embedding – and why. The OECD (2021[28]) has affirmed, among others, democracy, human rights, good governance, security, sustainability, and open markets as shared values. However, it is not the purpose of this chapter to posit particular values for the governance community. This policy framework advances techniques of a process-based approach, laying out tangible strategies for promoting values through design criteria and tools at different stages of the innovation process (OECD, 2018[6]). In practice, it sets out what might be considered guidance for responsible innovation and the development of “values-based technology”.

Design criteria define the generalisable characteristics of good technology governance and responsible innovation. While not a comprehensive list, these include anticipation, inclusivity and alignment, adaptivity, and international co-operation.

  • Anticipation. Technology governance faces a dilemma. Governing emerging technologies too early in the development process could be overly constraining, while governing them later can be expensive or impossible. Navigating the so-called “Collingridge dilemma” (Worthington, 1982[29]) requires a form of governance that operates “upstream” and throughout the process of scientific discovery and innovation. Predicting a particular technological trajectory is notoriously difficult or even impossible, but exploring possible technological developments is necessary and can create policy options.

  • Inclusivity and alignment. Involving a broad array of stakeholder groups, including actors typically excluded from the innovation process (e.g. small firms, remote regions, and certain social groups, including minorities), is important to align science and technology with future user needs and values. Inclusivity encompasses access both to technology itself and to the processes of technology development, where enriching the diversity of participants is linked to the creation of more socially relevant science and technology (OECD, forthcoming[30]). A related point is the need to include and integrate diverse disciplines and approaches in the R&D process in order to build richer understandings and fit-for-purpose design (Kreiling and Paunov, 2021[31]; Winickoff et al., 2021[32]; OECD, 2020[33]).

  • Adaptivity. The pace, scope and complexity of innovation pose significant governance challenges for governments (Marchant and Allenby, 2017[34]) and technology firms. As emerging technologies can have unforeseen consequences, and adverse events or outcomes may occur, the governance system must be adaptive to build resilience and stay relevant – a central tenet of the Recommendation of the Council for Agile Regulatory Governance to Harness Innovation (OECD, 2021[15]). Adaptivity as a design criterion is closely related to anticipation, in that adaptive principles and guidelines might be better suited to the fast pace of technological development.

An array of tools could help realise the above design criteria and embed values in the innovation process (see Figure 6.1). They are the operational element of the framework, the means to take action and govern emerging technologies. The following sections introduce three sets of tools that seek to advance the design criteria: forward-looking technology assessment (TA) promotes anticipation; societal engagement encourages inclusivity and alignment; soft law mechanisms can bolster adaptivity; for international co-operation, all three tools are important. These tools have strong counterparts among known tools for regulators (OECD, 2021[35]), but explicitly seek to engage STI actors – including research funders and agenda setters, researchers and engineers, entrepreneurs and small businesses, and industry – further upstream, i.e. during the technology development process. Together, these tools constitute a non-exhaustive package of policy interventions to implement anticipation, inclusion and alignment, adaptivity and international co-operation.

The framework in this chapter (as shown in Figure 6.1) aims to guide both national and international policy makers. The development, use and effects of technologies span national borders. The global scope of technological challenges creates a need for an international approach to the governance of emerging technologies. This scope carries implications for the design of both national and international technology governance systems. For national governments, this means that effective governance will require international policy engagement. This engagement is already a clear policy trend, exemplified by the numerous international activities noted above. International co-operation can grow around shared values, and the sharing of tools and good practices, and these in turn can guide national approaches (see Chapter 2).

The treatment of different technologies under such a holistic framework must not be one-size-fits-all. Governance needs for advanced nanomaterials will differ from those relating to new digital platforms or synthetic biology. Indeed, the appropriate approach will depend on the technology’s characteristics, such as:

  • its level of readiness for commercialisation

  • the profile of risks and potential benefits in the short and long term, as viewed by experts and the public

  • the nature of local, national, and international matters of concern

  • the level of public concern.

Nevertheless, applying a common framework at the national and international levels is important, as these emerging technologies share certain characteristics – such as uncertain trajectories and impacts, potential issues of public trust and the need for value-based reflection (Mathews, 2017[36]) – enabling broad areas of follow-up work. These common characteristics make common tools – including those that follow – highly relevant.

The governance of early-stage technologies poses a set of challenges that require forward-looking knowledge and analysis. This strategic intelligence can be defined as usable knowledge that supports policy makers in understanding the impacts of STI and potential future developments. Kuhlmann (2002[37]) identified several processes that could provide such “futures intelligence”, such as technology assessment (TA), technology foresight, anticipatory impact assessment and formative approaches to evaluation.

Emerging and early-stage technologies not only carry inherent uncertainties and complexities, but there are also situations where their desirability is unclear (e.g. human germline gene editing), because the promised novelty may well transcend existing ethical and political evaluations. The Collingridge dilemma sums up the challenge of finding the right time to govern technology using dedicated standards, rules, regulations and/or laws. To navigate this dilemma, new kinds of anticipation and strategic intelligence are essential (Robinson et al., 2021[38]).

This section focuses on TA as a source of strategic intelligence. It presents the rationales for TA, the trends shaping TA-based strategic intelligence and concludes with a review of challenges and policy considerations.

TA is an evidence-based, interactive process designed to bring to light the societal, economic, environmental, and legal aspects and consequences of new and emerging science and technologies. TA informs public opinion, helps direct R&D, and unpacks the hopes and concerns of various stakeholders at a given point in time to guide governance. Informally, various forms of TA have been in operation since the dawn of science and technology policy. Formally, TA began 50 years ago with the establishment of the Office of Technology Assessment (OTA) within the United States Congress.1 Its mission was to identify and consider the existing and potential impacts of technologies, and their applications in society. OTA emphasised the need to anticipate the consequences of new technological applications, requiring robust and unbiased information on their societal, political, and economic effects.

Following in the footsteps of OTA, parliamentary TA institutions also emerged in Europe. The Netherlands Organisation for Technology Assessment, for example, was established in 1986 to inform the Dutch Parliament on the developments and potential consequences of new technologies.2 Parliamentary TA institutions proliferated around the globe throughout the 1990s and 2000s. TA and TA-like processes have diversified with different (or expanded) objectives and are conducted in different situations and settings. One evolution is the expansion from expert-oriented TA activities to more participatory TA approaches. Participatory TA acknowledges that technology and society are entwined, and that underlying values should therefore be part of the TA process (Delvenne and Rosskamp, 2021[39]).

The main rationales of TA for emerging technology governance can fit into three broad and sometimes overlapping categories.

TA for informing decision makers on key technology trends. One role of TA is as a process of sense-making around emerging technologies, their state of the art, and their potential benefits and risks, be they economic, societal, or environmental. When addressing emerging and converging technologies such as synthetic biology, neurotechnology and quantum computing, TA must grapple with high degrees of uncertainty along multiple dimensions. It therefore serves an important function in structuring disparate and unclear information and translating it into usable knowledge that can inform decision-making.

TA for deliberation by gauging stakeholders’ hopes and concerns. Some forms of TA, such as participatory TA, bring together different stakeholder groups, which not only stimulates public and political opinion-forming on the societal and ethical aspects of STI, but also helps promote public trust through engagement and inclusion, one of the key design criteria in the framework. Participatory TA approaches are particularly relevant for probing and highlighting hopes and concerns around potentially disruptive and controversial technologies. Here, the inclusion of relevant stakeholders is key not only for providing democratic legitimacy and building trust, but also for deepening knowledge and expertise. Such stakeholders include associations of small and medium-sized enterprises (SMEs), civil society organisations, non-governmental organisations, trade unions, consumer groups and patient associations. Thus, integrating a variety of stakeholders and insights can help create a form of “distributed intelligence” (Kuhlmann et al., 1999[40]). However, critics of participatory TA highlight potential weaknesses, such as the lack of impact on decision-making, the lack of support from mainstream science and technology policy, and the exclusion of diverse kinds of knowledge (Hennen, 2012[41]).

TA as a means of building and steering technological and industrial agendas. Building national competitiveness through targeted investment in different areas of science and technology R&D is a key aspect of STI policy, in which TA can play a supportive role. For example, following the Portuguese Resolution of the Council of Ministers, the Ministry for Science and Higher Education commissioned the Portuguese Foundation for Science and Technology (FCT) to develop 15 thematic research and innovation agendas. Among them, the Industry and Manufacturing Agenda 2030 mobilised experts from R&D institutions and companies to explore potential opportunities and challenges for the Portuguese research and innovation system in the medium and long term. The agendas’ main objective was to promote collective reflection on the knowledge base required to pursue the scientific, technological, and societal goals in a given thematic area. FCT facilitated a bottom-up approach through an inclusive process involving experts from academia, research centres, companies, public organisations, and civil society.3

Some TAs combine all three rationales. One example is the Novel and Exceptional Technology and Research Advisory Committee (NExTRAC) at the National Institutes of Health (NIH) in the United States. The committee undertakes horizon-scanning and sense-making of new technologies; deliberates on ethical, legal, and societal issues with a variety of stakeholders; and directly informs the NIH Director in agenda-setting (National Institutes of Health, 2021[42]).

Since the founding of the OTA 50 years ago, there has been growing recognition that timely intelligence for STI policy and governance is necessary. Not only are technologies becoming more complex and more pervasive, but they are also evolving rapidly, with potential new and disruptive risks to the economy, environment, and society. While prudent STI policy and governance for emerging technologies mobilises strategic intelligence in various ways (Tuebke et al., 2001[43]), new trends are challenging established strategic intelligence practices to accommodate new needs. Stemming from a mixture of technological developments, new STI policy approaches and exogenous shocks, these trends produce new requirements for TA processes and outcomes.

Technology trends: The pace of convergence. The escalating and transformative interaction among seemingly distinct technologies, scientific disciplines, communities, and domains of human activity is achieving new levels of synergism (Roco and Bainbridge, 2013[44]). This “convergence” at different loci of the STI system means that ideas, approaches, and technologies from widely diverse fields of knowledge become relevant and necessary for analysing the potential impacts of such convergent systems (National Research Council, 2014[45]). Thus, convergence is placing new demands on strategic intelligence and TA to capture its implications for sociotechnical change.

Innovation policy trends: Mission-orientation. One major STI policy trend is the shift towards greater directionality (Borrás and Edler, 2020[46]), a theme treated in detail in Chapter 5. So-called “mission-oriented” innovation policies seek to steer research and innovation systems so that they contribute to achieving a societal goal (Robinson and Mazzucato, 2019[47]; Larrue, 2021[26]; Mazzucato, 2018[48]). Such approaches require articulating values through ambitious, clearly defined, measurable and achievable goals within a binding time frame (Lindner et al., 2021[49]). Missions envision large transformations. They pressure TA to move from techno-centric approaches focusing on a particular technology and its ramifications, to exploring portfolios of technologies (e.g. related to mobility, energy production and waste management) and how they might impact and drive transformations in value chains, industries, and whole sociotechnical systems. In Germany, the federal government’s most recent funding instrument, “INSIGHT”, promotes a holistic, forward-looking impact assessment of innovations. In addition to the natural and technical sciences, the assessment includes ethical, social, legal, economic, and political considerations. Acknowledging the increasing importance of social innovations, the focus shifts from “pure” technology analysis to including societal developments in innovation processes.

Crises and societal missions are driving what could be termed “solution-centric” TA. In the Netherlands, the Rathenau Institute develops TAs focusing on problems such as deepfakes (synthetic media) (STOA, 2021[50]) and cyber resilience (van Boheemen et al., 2020[51]). In the United States, the Government Accountability Office (GAO) has been focusing on problems like reducing freshwater use in hydraulic fracturing and power plant cooling and tracing the source of chemical weapons (GAO, 2020[52]). One recent TA by GAO assesses the vaccine development chain for infectious diseases (see Figure 6.2). Here, the goal was to identify key technologies that could enhance the ability of the United States to respond rapidly and effectively to high-priority infectious diseases through rapid vaccine development.

Exogenous forces: A proliferation of crises. Proliferating crises – e.g. the COVID-19 pandemic, Russia’s war of aggression against Ukraine and the subsequent energy crisis, and the local effects of extreme events such as droughts, flooding and forest fires linked to climate change – reshape the requirements for strategic intelligence. As a recent example, the rapid spread of COVID-19 caught most nations off guard, requiring accelerated development and deployment of vaccines and ventilators, as well as knowledge about the virus, its spread, and mutations. Governments around the world had to deal with a crisis featuring high scientific uncertainty, making rapid decisions – such as mobility restrictions – that would affect national populations and beyond. Crises require urgent action. They put pressure on the production of useful and timely strategic intelligence to shape actions in near-real time. TA practitioners are also challenged to incorporate detailed investigations into the rapid scaling and diffusion of new and emerging technologies, and to consider the societal, economic, and environmental effects of rapid scaling.

While global TA practice is still rife with techno-centric TA activities, solution-based and crisis-driven TAs are increasing, bringing with them many questions regarding tools and processes. How wide a portfolio of technologies is there to explore? What is the scope of the TA study? What sort of inclusion is needed to build trust and harness collective intelligence? How rapidly is the intelligence from TA needed for decision-making, and how does this balance with the depth and breadth of TA analysis?

The trend towards mission-oriented innovation policies (see Chapter 5) requires identifying and enacting core societal values that should drive technical change. TA is well placed to spell out these values, particularly around controversial technologies. However, the increasing complexities of emerging technologies and their impacts make it necessary to move beyond techno-centric perspectives. Adopting a socio-centric approach, in turn, increases the complexity and information requirements of not only technology options, but also of the value chains and systems involved.

Crises increase demands for the rapid sourcing and scaling of technology solutions. However, uncertainty in both the emerging technology options and the impacts generated as they scale increases the need for controlled speculation on both the mechanisms for rapid scaling and its various facets. TA and other sources of strategic intelligence, such as foresight, are potential approaches to this end.

Achieving an anticipatory system of technology governance will require recognising the central role of citizens and stakeholders in ensuring the use of trusted and trustworthy technology in society. Contemporary sociological accounts of the relationship between science, technology and society demonstrate that knowledge is increasingly produced in contexts of application, publics are aware of how STI affect their interests and values, and these interests can shape innovation (Jasanoff, 2007[56]). The numerous forms of stakeholder participation in the communication and making of science and technology contradict the so-called “deficit model” of publics as largely ignorant and irrational (Wynne, 1991[57]). But misunderstandings still exist (Chilvers and Kearnes, 2015[58]). Upstream stakeholder engagement can help frame – and reframe – the issues at stake (Jasanoff, 2003[59]) and “open up” important new questions (Stirling, 2007[60]). It must also be translated into practice, so experimentation and knowledge sharing will be important. Reviewing a large body of literature on societal engagement in the context of emerging technologies, this section focuses on how to engage societal stakeholders upstream in technology development to promote trust and trustworthiness.

Why is engagement necessary from the perspective of achieving an anticipatory and inclusive technology governance system? First, engagement can surface societal goals for emerging technology at different points in the complex innovation system, from agenda-setting to product design and diffusion, contributing to a better alignment of technological development with social needs (von Schomberg, 2013[23]). Such alignment, unfolding in an iterative process, is one of the key functions of emerging technology governance and responsible innovation.

Second, engaging societal stakeholders earlier in the development process can help spot public sensitivities and ethical shortcomings. Societal stakeholders bring experiential knowledge to societal problems (OECD, 2020[33]) and offer the perspectives of future users (Kreiling and Paunov, 2021[31]). This diversifies the types of expertise included during technology development, potentially pointing to application challenges or raising questions that innovators, despite their knowledge and expertise, do not anticipate. Such diversity has the potential to uncover certain biases that are built into digital and other technologies. Subsequent design considerations could help foster societal acceptance, avoid backlashes and controversies that could lead to adoption failures (OECD, 2016[61]), and manage expectations for future products and services.

Third, stakeholder engagement promotes public understanding of science and technology, and enhances the societal capacity for deliberating on technological issues. Such deliberation and consultation can breed trust and enrich the relationship between science and society – although pre-ordained consultation can undermine engagement as a trust-building exercise (Wynne, 2006[62]).

Fourth (and related to the first point), societal engagement presents an opportunity to bring representatives from diverse cultures, demographics, ages, social structures, and skill levels into the innovation process. Including their views, and building stakeholder capacity, not only addresses entrenched forms of exclusion but could render technologies more relevant to broader social groups.

Use of new digital technologies. Digitalisation has advanced the use of atypical engagement formats, such as online tools or immersive virtual-reality technologies and simulation, although traditional paper-based or face-to-face approaches are still used most frequently (BEIS, 2021[63]).

Iterative and sequenced engagement. Staged approaches have become more frequent. One example is the “IdeenLauf” (“flow of ideas”) initiative during German Science Year 2022, which collected societal impulses to inform science and research policy. First, citizens submitted over 14 000 questions for science. Second, the questions were consolidated, complemented by additional texts to provide relevant context, and discussed among scientists and selected citizens. Third, citizens commented on the text via online consultation. The final report was presented to policy makers and researchers in November 2022.4

Directionality: Focus shifts from technologies to missions, goals, and future products. Emerging technologies are often not yet embodied in future products or services, complicating exchanges between technology experts and broader publics. One trending response to this challenge has been to focus the engagement exercise on issues that societal stakeholders can more easily relate to. An example in the area of future mobility is the “GATEway” project in the United Kingdom, which conducted live public trials on connected and autonomous vehicles resulting in insights on public acceptance of, and attitudes towards, driverless vehicles (BEIS, 2021[63]).

Focus on diversity. There is momentum to ensure age, ethnic, gender, cultural and other forms of diversity in the make-up of the “publics” engaged in consultation. However, practitioners still perceive a diversity gap in both the theory and practice of engagement, with problems on both sides: some communities are not solicited and are thus unable to provide inputs, while technology experts do not learn about the needs and values of these future users.

Engagement techniques can be categorised under three main groups, corresponding to their different purposes (Figure 6.3). Mode 1 (capacity-building) can be viewed as a prerequisite that establishes the conditions for effective societal engagement and democratic governance. Mode 2 (communicate and consult) gathers the views of citizens or informs them, which may have an indirect influence on technology governance decisions. Mode 3 (co-construct technology development) engages societal stakeholders more directly in the construction of science and technology.

Clarity on the rationale for stakeholder engagement and its timing – before, during or in parallel to the technology development process – is essential when deciding on a suitable societal engagement technique. Deliberative capacity-building (Mode 1) acts as a foundation or enabler of societal engagement and occurs during and alongside innovation processes. Societal engagement exercises before or during the research planning phase tend to focus on communicating with or consulting societal stakeholders (Mode 2). Engagement efforts to co-construct science and technology pathways (Mode 3) occur during development, e.g., of prototypes or testing at scale.

Anticipatory governance has been defined as “a broad-based capacity extended through society that can act on a variety of inputs to manage emerging knowledge-based technologies while such management is still possible” (Guston, 2013[22]). Mode 1 activities (see Table 6.1) help build the capacity of publics and innovators to engage in deliberative processes and contribute constructively to governance discussions. They can include techniques (such as communication training) aimed at scientists and innovators, programmes to involve them in the science policy process, and multidisciplinary work that embraces the social sciences and humanities. Other engagement techniques (like science and science policy training) focus on journalists and the media.

These activities also tend to focus on assembling and empowering specific stakeholder groups around technology development, design, and governance. For instance, the European Human Brain Project built an inclusive community for the EBRAINS research infrastructure. This network of external collaborators (including patient associations, clinicians, and industry) brings together those who are particularly concerned by future technology applications.5 The project also provides information platforms and games to build knowledge and skills at the interface of science and society. Two examples in the field of synthetic biology are the citizen game “Nanocrafter” and the annual “iGem” student competition,6 both of which also feature community-building elements. The European Commission’s e-learning platform, “Digital Skillup”, is designed for both beginners and advanced users. It helps them explore emerging technologies and their impact on everyday life and offers training on topics like cybersecurity or the digital revolution.7 In the United States, the Science and Technology Policy Fellowships of the American Association for the Advancement of Science place talented scientists and engineers in positions of federal policy making, furthering the training of a cadre of communicators and contributors across the science and society divide.8

Mode 2 pertains to engagement techniques aiming to gather stakeholder views. While their outcomes and influence on the innovation process are often indirect, they do have capacity-building elements. For example, a UK citizen jury exercise to understand public attitudes towards ethical AI also resulted in participants gaining a better understanding of automated decision systems (BEIS, 2021[63]).

Mode 2 contains a wide array of mechanisms and processes for soliciting views and attitudes towards emerging technology (see Table 6.2). Processes can vary from a one-off citizen dialogue to a sequence of meetings and conversations lasting many months. An important consideration across many Mode 2 engagement techniques is the need to design engagement spaces. This includes not only the location’s selection, accessibility, and institutional affiliation, but also the types of event formats and interactivity. For example, public outreach in science museums may take the form of exhibitions or spaces for experimentation. At Science Café events, on the other hand, scientists may engage with lay persons and discuss their research. Each form of consultation requires a different engagement space.

Mode 3 encompasses the wide variety of modalities for direct contribution by stakeholders and even publics to the creation of new knowledge and technology. As shown in Table 6.3, these techniques and processes promote exchanges between innovators and societal stakeholders that may explore complex and controversial questions and capture deeper underlying values and trade-offs. The exchange is bidirectional, resulting in the “co-construction” or “co-creation” of STI (König, Baumann and Coenen, 2021[64]; Kreiling and Paunov, 2021[31]).

Mode 3 engagements can occur at different stages in the innovation process.

  • Agenda-setting: engagement typically occurs in participatory agenda-setting exercises, using formats like “decision theatres” or “social foresight labs”. The rationale is to co-create or inform research agendas (Matschoss et al., 2020[65]) by involving, for example, patient groups (Scheufele et al., 2021[66]). It can also be to integrate the needs of rural areas and indigenous communities in research and innovation processes (Schroth et al., 2020[67]).

  • New knowledge creation: community-based research strives to build equitable partnerships based on long-term commitment and applies interventions that are beneficial to all stakeholders involved (Baik, Koshy and Hardy, 2022[68]). This category also includes different forms of citizen science and transdisciplinary research (OECD, 2020[33]), both of which are premised on the power of experiential and acquired expertise in the creation of new knowledge. One example is the German funding initiative for citizen science, which is extending support to 28 projects in two phases between 2017 and 2024.9

  • Prototype development: the prototype stage is an important innovation milestone, and engagement is increasingly considered critical to its success. User-centric methods for the development and testing of prototypes have been evolving. For example, Rodriguez-Calero et al. (2020[69]) identified 17 strategies to engage stakeholders with prototypes during front-end design activities in the area of medical devices.

  • Deployment and testing at scale: Maker spaces have been used to engage societal stakeholders. For example, the “Lorraine Fab Living Lab”10 tests prototypes and prospectively assesses innovative usages, combining elements of FabLabs and Living Labs (Engels, Wentland and Pfotenhauer, 2019[70]).

  • Engagement governing scientific conduct: such engagement occurs alongside technology development processes and could result in the development of guidelines, such as on human genome editing (Iltis, Hoover and Matthews, 2021[71]). The term “open innovation” describes the opening up of the innovation process. In the private sector, open innovation happens when future consumers are included in “customer co-creation” activities (Piller, Ihl and Vossen, 2010[72]), resulting in “prosumers” (Rayna and Striukova, 2015[73]). Initiatives are underway to build industry tools for engagement. One such initiative is the “Societal engagement with key enabling technologies” (SOCKETS)11 project supported by the European Commission (2020-23), which develops and tests methods to engage citizens in the industrial development and use of key enabling technologies.

Despite their importance, establishing and running engagement initiatives upstream in the innovation process can be challenging, from both a procedural and an organisational standpoint. Procedural challenges relate to the context and impact of the engagement exercise. Concretely, this means using the appropriate channels to ensure that inputs from engagement reach relevant decision makers and innovators, and that engagement exercises are not perceived as an additional requirement to be met with a “tick-box mentality” by innovators. Moreover, processes tend not to recognise that experts and communities have different stakes, with traditional decision makers having more to gain and marginalised communities potentially having more to lose. Hence, another issue lies in the power relations between technical experts and societal stakeholders (see Chapter 4). Implementing meaningful participation requires capacity-building and training, as well as developing formats, procedures and a framework that enable members of the public to participate in the process (Schroth et al., 2020[67]).

Organisational challenges revolve around selecting and motivating stakeholders. In this respect, both the scope of the perceived societal impact of the technology and the societal relevance of the research are key. In the case of emerging technologies, relevance and urgency for stakeholders may not be high (de Silva et al., 2022[74]). Still, some technology solutions may affect a smaller group of (local) stakeholders, while others could impact broader groups and cover a geographically larger (global) scale. Lack of relevance, expertise, trust, skills, motivation, incentives, time, and financial resources are common engagement barriers across all stakeholder groups.

As diversity, equity and inclusion become dedicated goals, stakeholder differences in terms of knowledge, ways of communicating, values, expectations, contextual understanding, and routes to forming opinions may become even more pronounced. Handling such differences and disagreement requires an open yet focused debate that balances between an overly narrow and an overly open framing of the issues, facilitating deliberation without forcing consensus (Bauer, Bogner and Fuchs, 2021[75]).

Compared to strategic intelligence and societal engagement, norms and institutions are the more typical tools of technology governance through, for example, regulation, rules, and standards by authoritative bodies. However, while they will be necessary in certain situations, formal regulatory approaches that use norms to define permissible and impermissible activities, along with sanctions or incentives to ensure compliance, may present disadvantages in more upstream contexts. First, the speed of technological advances makes it difficult for regulation to keep up. Second, novel ethical, social, and economic issues can operate outside or across regulatory jurisdiction and expertise. Third, applications across multiple industries and government agencies can create interagency co-ordination problems. For all these reasons, formal regulatory approaches may be ill-suited to govern emerging technology, at least in the earlier stages of development (Marchant and Wallach, 2015[76]; Hernández and Amaral, 2022[77]; OECD, 2019[78]). Further, attempts to govern emerging technology could derail innovative approaches, prompting concerns that companies and technologies may simply move across borders (Pfotenhauer et al., 2021[79]).

The OECD is rethinking regulatory policy to document and encourage more agile regulatory governance using a wide array of approaches (OECD, 2021[15]). One such approach might be to use principles, standards, guidelines, and codes with moral or political force but without formal legal enforceability. These “soft law” approaches may provide a number of advantages in terms of multisector co-operation and cross-jurisdictional flexibility (García and Winickoff, 2022[80]). For instance, Gutierrez, Marchant and Michael (2021[81]) have pointed to the adaptivity of soft law in governing AI, noting that “AI’s dynamic and rapidly evolving nature … make it challenging to keep in place. In these scenarios, soft law…can transcend the boundaries that typically limit hard law and, by being non-binding, serve as a precursor or as a complement or substitute to regulation.” Indeed, soft law is an increasingly important mode of governance for emerging technology (Hagemann, Huddleston and Thierer, 2019[82]), although its effective deployment presents both opportunities and challenges. In the current context, soft law – in all its different forms – should be considered an important tool for achieving an emerging technology governance system that is more anticipatory, inclusive, and adaptive.

Guidelines, standards and codes of practice feature different types and rationales. Organisations create high-level principles that communicate a joint commitment to ideals and values-based operations. Standard-setting bodies – such as the Institute of Electrical and Electronics Engineers (IEEE) and the International Organization for Standardization (ISO) – develop technical norms to guide communities of practice. Professional groups and firms also often ask their members to follow certain rules and codes of conduct. Governments can publish guidelines while threatening to pass enforceable laws as a backstop in the event of insufficient adherence. Finally, voluntary programmes, labels or certification schemes may drive markets, and ultimately the adoption of best practices.

At a time when new international legal treaties are rarely achieved, principles can be an attractive modality for international, transnational and/or global actors to make moral and political commitments with some flexibility and accommodation for differences and changing circumstances. Principles can operate at the international level through a number of organisational sources, from the United Nations to the Council of Europe and the OECD. The OECD offers salient examples of public international recommendations that present principles in the field of technology governance. OECD recommendations feature regular reporting requirements for Adherents, to promote progress in their implementation as well as transparency. Recent recommendations and implementation work include:

  • May 2019: the Recommendation of the Council on Artificial Intelligence (OECD, 2019[17]), under which the OECD convened a multi-stakeholder group, developed a practical toolkit and created an “observatory” of existing policies to promote mutual learning, and which led to the establishment of a new OECD Working Party on AI Governance.

  • December 2019: the Recommendation on Responsible Innovation in Neurotechnology (OECD, 2019[11]), which seeks to anticipate problems during the course of innovation, steer technology towards the best outcomes, and include many stakeholders in the innovation process.

  • October 2021: the Recommendation for Agile Regulatory Governance to Harness Innovation (OECD, 2021[15]), which provides guidance for policy makers to design agile regulations that can address the regulatory challenges and opportunities arising from emerging technologies.

Other important technology governance mechanisms arise at the public and private interface. As a case in point, ISO is an independent, non-governmental international organisation with a membership of 167 national standards bodies. Among other things, ISO sets many technical standards in the arena of emerging technology, which are developed through a stakeholder-driven process at a fairly high level of technical detail. ISO/TR 12885:2018 on health and safety practices in occupational settings of nanotechnologies is a good example of a technical governance standard.12 This standard focuses on the occupational manufacture and use of manufactured nano-objects, and their aggregates and agglomerates greater than 100 nanometres.

Novel and specialised codes of practice in science and engineering are sometimes deployed before new technologies hit the market, when their potential risks and harms are anticipated but not well-known, or the work has significant ethical implications. These can cross over into public funding agencies through policy. A good example of guidelines that have influenced both the public and private sectors are those developed by the International Society of Stem Cell Research (ISSCR) (Box 6.3).

Many companies find it advantageous to work at the industry-wide level to design joint solutions to governance in the form of self-regulation. For example, the biopharmaceutical industry is experiencing intense changes, with a number of frontier technologies impacting the way it does research, commercialises its products, and collaborates with partners and stakeholders across the world. At the industry level, the International Federation of Pharmaceutical Manufacturers and Associations has responded by creating new bodies, like global future health technologies and bioethics working groups, to consider the next generation of risks, benefits, and standards, with a view to updating its “Code of Practice”.13 Another example is the International Gene Synthesis Consortium (Box 6.4), which has developed a strong network and commitment to biosecurity measures in the industry.

Technology-based standards determine the specific characteristics (size, shape, design, or functionality) of a product, process, or production method. These standards are an important form of governance that can emanate from both the private sector (e.g., de facto standards in the form of dominant designs) and the public sector (e.g., government-regulated vehicle safety standards or mobile phone frequency bands). Environmental non-governmental organisations (NGOs) are partnering with industry on the development of product standards for new food products driven by new and emerging technologies. These partnerships can help generate standards or certification schemes that may command premiums in the market.

Co-developed product standards have potential utility for “upstream governance” because retailers can leverage their market power to influence how technology developers are considering unanticipated consequences throughout the supply chain, from design and sourcing to disposal. Companies are accountable as they have a duty to report on their activities to their investors. They have the power to “bake in” these concerns as the new technologies, chemicals and innovations develop.

Recently, the Environmental Defense Fund, a US-based NGO, worked with the private sector to develop principles and standards to ensure the environmental sustainability of cell-based meat and seafood. This information allows companies to assess these products’ potential impacts on human health, the environment and society, and to communicate the implications to stakeholders clearly and transparently (Environmental Defense Fund, 2021[88]). An important question was how to translate the mechanisms and principles of co-design and upstream engagement into practice. Involving multiple stakeholders was key to ensuring the quality and legitimacy of the guidance.

With the “ethics-by-design” or “sustainability-by-design” approach to governance, some firms and regulatory agencies assess and build in the sustainability or ethical implications of new technologies at different stages of technology development. The “Safe(r)-by-Design” concept, for instance, encourages industry to reduce uncertainties and risks to human and environmental safety, starting at an early phase of the innovation process and covering the whole innovation value chain (or life cycle for product development) (OECD, 2022[89]).

This ethics-by-design approach seeks to embed ethics and societal values – such as privacy, diversity, and inclusion – through clear protocols (e.g., search protocols in AI). Analytical tools can serve to assess impacts on privacy, safety, diversity, inclusion, and human rights, and to avoid bias. At the December 2021 Summit for Democracy, the United States announced new international technology initiatives, including International Grand Challenges on Democracy-Affirming Technologies, to drive global innovation on technologies that embed democratic values and principles (Matthews, 2021[90]). In July 2022, the United States and the United Kingdom co-launched “a set of prize challenges to unleash the potential of privacy-enhancing technologies (PETs) to combat global societal challenges”, making sure privacy and trust are at the heart of the design process (The White House, 2022[91]).

Principles, guidelines, standards, and codes of practice face some challenges. First, they may lack the formal legitimacy of regulations, which is derived from governments’ legislative authority. This means that they may escape some of the formal procedures required to enact regulations, such as transparent and accountable public comment periods and structured stakeholder engagement.

Second, the efficacy of these systems must be better addressed should “soft law” become an even more important tool (Hagemann, Huddleston and Thierer, 2019[82]). Third, the existence of too many non-binding sets of norms in a particular terrain may cause overlaps, impeding efficacy across the complex system of actors and institutions that make up global governance (Black, 2008[92]).

Technological sovereignty as a concept is becoming more pronounced, and more countries are striving for technological sufficiency – if not clear advantages – in specific domains (see Chapter 2). Yet this movement towards national or regional approaches might be out of step with current demands. The global nature of the challenges facing the world today requires greater technological (or other) co-operation. The question is whether – and how – the technological governance framework addresses these dynamics.

The section above presented the use of such design criteria and tools at the national level. This framework could also encourage technological co-operation at the international level – first, by reinforcing commitment to common values such as human rights, responsibility, economic co-operation, and democratic governance; and second, by paving the way for the development of international approaches, such as good strategic intelligence, stakeholder and societal engagement, and mechanisms like OECD recommendations. As stated previously, international co-operation is a consideration for good emerging technology governance that spans the gamut of values, design criteria and tools.

As explored in the above section, anticipatory tools can enhance the capacity to spot issues, understand a given technological and governance landscape, and ultimately make better governance decisions. Across the world, TA, strategic foresight, and other forms of strategic intelligence (such as horizon-scanning) are being applied at the national level to inform national STI policies and technology governance.

One clear gap in the landscape of strategic intelligence lies in the international arena. International technology decision-making on the possible limits on geoengineering,14 human augmentation15 and AI will require strategic intelligence sources that are trusted across countries and sectors. Commonly recognised evidence can serve as the foundation of agreement on different forms of governance. The Intergovernmental Panel on Climate Change, which has supported climate co-ordination and co-operation under the Paris Agreement/COP21, is a case in point. Such global forward-looking analysis could be informed by and link to so-called “global observatories”, which aggregate policy approaches and technology developments. The AI Observatory at the OECD is a good model, with its searchable database of AI policies and normative instruments throughout the world, and its hub for expert blog posts and articles. Some have proposed a Global Observatory for Gene Editing which would serve a broader set of functions, notably to enrich ethical, legal, and cultural understandings, and encourage debate among the global citizenry (Jasanoff and Hurlbut, 2018[93]). Collaboration around such efforts at the international level could pool insights on the development and potential impacts of technology, as well as build best practices for collective strategic intelligence.

In this vein, the development of emerging technologies has ramifications for the nature of citizen engagement. For example, geoengineering techniques could affect weather patterns or water supply, with impacts that are not restricted to national borders. AI applications exert profound impacts not only on national, but global, economies. Growing calls for international public deliberation exercises, such as a global citizens' assembly on genome editing (Dryzek et al., 2020[94]), evince the co-emergence of technology and new kinds of global citizenship. Going from the traditionally local or national level to the global scale will require adapting engagement techniques, for example by using formats like World Wide Views.16 However, deciding which stakeholder groups should be involved in global efforts raises questions related to the nature of international publics and the identification of relevant stakeholders.

Addressing governance challenges at the country level runs the risk of being ineffective at best and counter-productive at worst, as particular jurisdictions could exploit the governance gaps to gain advantage. Several of the governance modalities (such as the OECD recommendations) discussed in this chapter operate at an international level, offering an opportunity to co-ordinate and even harmonise different jurisdictions’ approaches. Further, standards emanating from industry groups or public-private partnerships can work transnationally, across and through jurisdictions linked by supply chains, markets, and border-crossing actors.

Technology is driving economies, political systems, and cultures. It promises great advances for human well-being, supporting solutions to grand challenges such as green transitions and pandemics. However, technology developers and users, as well as policy makers, must be mindful of the fine balance between enabling innovation for societal benefit and reducing potential risks to democratic values (e.g. equity, transparency, accountability) that may undermine human rights or have other undesirable societal, political, or economic consequences. While important thinking and tools to regulate technology continue to develop, it is important to note that a co-evolutionary process is taking place between technological development and today’s societal structures. The social and political shaping of technology happens through a myriad of ways and policies, including intellectual property laws, science agenda-setting and funding, and regulatory policy. Here, an anticipatory framework featuring generalisable design criteria and tools can help guide the innovation process to embed values more purposefully into the technology development process.

Anticipation, the inclusion and integration of stakeholders, and adaptability are key design criteria allowing more explicit consideration of values in the technological development process. International co-operation grows out of shared values and informs design criteria and policy tools. But these design criteria must be optimised, using a variety of tools and activities that can drive the embedding process. Forward-looking TA both depends on and supports the expression of key values, which underpin the analysis of potential benefits and harms, and the trajectories of emerging technology. Societal and stakeholder engagement can bring a democratic element to the governance of emerging technology, enabling deliberation on the values that should support and guide technological development. Finally, co-developed standards can endow the governance system with the necessary adaptivity and utility as it sets a normative stance towards technology through standards and guidelines. This framework will not define core human rights and values, but it could clear the way for a more reflective stance towards emerging technologies and the values they embody. As actioned through this pragmatic framework, this stance might ground a more co-operative approach to developing technologies in, for and with societies.


[83] Anthony, E., R. Lovell-Badge and S. Morrison (2021), “New guidelines for stem cell and embryo research from the ISSCR”, Cell Stem Cell, Vol. 28/6, pp. 991-992, https://doi.org/10.1016/j.stem.2021.05.009.

[68] Baik, E., A. Koshy and B. Hardy (2022), “Communicating CRISPR: Challenges and opportunities in engaging the public”, in Progress in Molecular Biology and Translational Science, Molecular Biology and Clinical Medicine in the Age of Politicization, Elsevier, https://doi.org/10.1016/bs.pmbts.2021.11.004.

[8] Baldwin, C. and C. Woodard (2009), The Architecture of Platforms: A Unified View, Edward Elgar Publishing, https://doi.org/10.4337/9781849803311.00008.

[75] Bauer, A., A. Bogner and D. Fuchs (2021), “Rethinking societal engagement under the heading of Responsible Research and Innovation: (novel) requirements and challenges”, Journal of Responsible Innovation, Vol. 8/3, pp. 342-363, https://doi.org/10.1080/23299460.2021.1909812.

[63] BEIS (2021), “The use of public engagement for technological innovation: literature review and case studies”, Department for Business, Energy and Industrial Strategy, https://www.gov.uk/government/publications/the-use-of-public-engagement-for-technological-innovation-literature-review-and-case-studies (accessed on 7 March 2023).

[92] Black, J. (2008), “Constructing and contesting legitimacy and accountability in polycentric regulatory regimes”, Regulation & Governance, Vol. 2/2, pp. 137-164, https://doi.org/10.1111/j.1748-5991.2008.00034.x.

[46] Borrás, S. and J. Edler (2020), “The roles of the state in the governance of socio-technical systems’ transformation”, Research Policy, Vol. 49/5, p. 103971, https://doi.org/10.1016/j.respol.2020.103971.

[5] Capri, A. (2019), “Techno-Nationalism: What Is It And How Will It Change Global Commerce?”, Forbes, https://www.forbes.com/sites/alexcapri/2019/12/20/techno-nationalism-what-is-it-and-how-will-it-change-global-commerce/?sh=62458045710f (accessed on 6 March 2023).

[58] Chilvers, J. and M. Kearnes (eds.) (2015), Remaking Participation, Routledge, Abingdon/New York, https://doi.org/10.4324/9780203797693.

[86] Cong, L. et al. (2013), “Multiplex Genome Engineering Using CRISPR/Cas Systems”, Science, Vol. 339/6121, pp. 819-823, https://doi.org/10.1126/science.1231143.

[19] Council of Europe (2019), Strategic Action Plan on Human Rights and Technologies in Biomedicine (2020-2025), https://rm.coe.int/strategic-action-plan-final-e/1680a2c5d2 (accessed on 6 March 2023).

[4] Crespi, F. et al. (2021), “European Technological Sovereignty: An Emerging Framework for Policy Strategy”, Intereconomics, Vol. 56/6, pp. 348-354, https://doi.org/10.1007/s10272-021-1013-6.

[74] de Silva, M. et al. (2022), “How did COVID-19 shape co-creation?: Insights and policy lessons from international initiatives”, OECD Science, Technology and Industry Policy Papers, No. 134, OECD Publishing, Paris, https://doi.org/10.1787/e11c5274-en.

[39] Delvenne, P. and B. Rosskamp (2021), “Cosmopolitan technology assessment? Lessons learned from attempts to address the deficit of technology assessment in Europe”, Journal of Responsible Innovation, Vol. 8/3, pp. 445-470, https://doi.org/10.1080/23299460.2021.1988433.

[87] Diggans, J. and E. Leproust (2019), “Next Steps for Access to Safe, Secure DNA Synthesis”, Frontiers in Bioengineering and Biotechnology, Vol. 7, https://doi.org/10.3389/fbioe.2019.00086.

[94] Dryzek, J. et al. (2020), “Global citizen deliberation on genome editing”, Science, Vol. 369/6510, pp. 1435-1437, https://doi.org/10.1126/science.abb5931.

[70] Engels, F., A. Wentland and S. Pfotenhauer (2019), “Testing future societies? Developing a framework for test beds and living labs as instruments of innovation governance”, Research Policy, Vol. 48/9, p. 103826, https://doi.org/10.1016/j.respol.2019.103826.

[88] Environmental Defense Fund (2021), “Innovative Foods: A Guide to Responsible Investment in Cell-cultured Meat and Seafood”, https://business.edf.org/files/InnovativeFoods-CellCulturedMeat-Brochure-LOWres-092921.pdf (accessed on 7 March 2023).

[7] Folke, C. et al. (2005), “Adaptive Governance of Social-Ecological Systems”, Annual Review of Environment and Resources, Vol. 30/1, pp. 441-473, https://doi.org/10.1146/annurev.energy.30.050504.144511.

[53] GAO (2021), “Vaccine Development: Capabilities and Challenges for Addressing Infectious Diseases”, U.S. Government Accountability Office, https://www.gao.gov/products/gao-22-104371 (accessed on 7 March 2023).

[52] GAO (2020), “Tracing the source of chemical weapons”, https://www.gao.gov/assets/gao-21-271sp.pdf (accessed on 7 March 2023).

[80] García, L. and D. Winickoff (2022), “Brain-computer interfaces and the governance system: Upstream approaches”, OECD Science, Technology and Industry Working Papers, No. 2022/01, OECD Publishing, Paris, https://doi.org/10.1787/18d86753-en.

[22] Guston, D. (2013), “Understanding ‘anticipatory governance’”, Social Studies of Science, Vol. 44/2, pp. 218-242, https://doi.org/10.1177/0306312713508669.

[81] Gutierrez, C., G. Marchant and K. Michael (2021), “Effective and Trustworthy Implementation of AI Soft Law Governance”, IEEE Transactions on Technology and Society, Vol. 2/4, pp. 168-170, https://doi.org/10.1109/tts.2021.3121959.

[82] Hagemann, R., J. Huddleston and A. Thierer (2019), “Soft Law for Hard Problems: The Governance of Emerging Technologies in an Uncertain Future”, Colorado Technology Law Journal, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3118539 (accessed on 7 March 2023).

[9] Hasselman, L. (2016), “Adaptive management; adaptive co-management; adaptive governance: what’s the difference?”, Australasian Journal of Environmental Management, Vol. 24/1, pp. 31-46, https://doi.org/10.1080/14486563.2016.1251857.

[41] Hennen, L. (2012), “Why do we still need participatory technology assessment?”, Poiesis & Praxis, Vol. 9/1-2, pp. 27-41, https://doi.org/10.1007/s10202-012-0122-5.

[77] Hernández, G. and M. Amaral (2022), “Case studies on agile regulatory governance to harness innovation: Civilian drones and bio-solutions”, OECD Regulatory Policy Working Papers, No. 18, OECD Publishing, Paris, https://doi.org/10.1787/0fa5e0e6-en.

[13] HM Government (2022), “Future Tech Forum Chair’s Report”, https://www.gov.uk/government/publications/future-tech-forum-chairs-report/future-tech-forum-chairs-report (accessed on 6 March 2023).

[71] Iltis, A., S. Hoover and K. Matthews (2021), “Public and Stakeholder Engagement in Developing Human Heritable Genome Editing Policies: What Does it Mean and What Should it Mean?”, Frontiers in Political Science, Vol. 3, https://doi.org/10.3389/fpos.2021.730869.

[56] Jasanoff, S. (2007), Designs on Nature: Science and Democracy in Europe and the United States, Princeton University Press, https://press.princeton.edu/books/paperback/9780691130422/designs-on-nature (accessed on 7 March 2023).

[59] Jasanoff, S. (2003), “Technologies of Humility: Citizen Participation in Governing Science”, Minerva, Vol. 41/3, pp. 223-244, https://doi.org/10.1023/a:1025557512320.

[93] Jasanoff, S. and J. Hurlbut (2018), “A global observatory for gene editing”, Nature, Vol. 555/7697, pp. 435-437, https://doi.org/10.1038/d41586-018-03270-w.

[64] König, H., M. Baumann and C. Coenen (2021), “Emerging Technologies and Innovation—Hopes for and Obstacles to Inclusive Societal Co-Construction”, Sustainability, Vol. 13/23, p. 13197, https://doi.org/10.3390/su132313197.

[31] Kreiling, L. and C. Paunov (2021), “Knowledge co-creation in the 21st century: A cross-country experience-based policy report”, OECD Science, Technology and Industry Policy Papers, No. 115, OECD Publishing, Paris, https://doi.org/10.1787/c067606f-en.

[37] Kuhlmann, S. (2002), “Distributed Techno-Economic Intelligence for policymaking”, https://www.urenio.org/e-innovation/stratinc/files/library/22.pdf (accessed on 6 March 2023).

[40] Kuhlmann, S. et al. (1999), “Improving Distributed Intelligence in Complex Innovations Systems”, https://www.semanticscholar.org/paper/Improving-Distributed-Intelligence-in-Complex-Kuhlmann-Boekholt/bacef7fa27b0942bce2aad3b0576cb3fa76e04ec (accessed on 6 March 2023).

[26] Larrue, P. (2021), “The design and implementation of mission-oriented innovation policies: A new systemic policy approach to address societal challenges”, OECD Science, Technology and Industry Policy Papers, No. 100, OECD Publishing, Paris, https://doi.org/10.1787/3f6c76a4-en.

[49] Lindner, R. et al. (2021), “Mission-oriented innovation policy: From ambition to successful implementation”, Perspectives - Policy Brief, No. 02/2021, Fraunhofer ISI, Karlsruhe, https://www.isi.fraunhofer.de/content/dam/isi/dokumente/policy-briefs/policy_brief_mission-oriented-innovation-policy.pdf (accessed on 7 March 2023).

[27] Malina, D. (ed.) (2022), “Governance of Emerging Technologies in Health and Medicine — Creating a New Framework”, New England Journal of Medicine, Vol. 386/23, pp. 2239-2242, https://doi.org/10.1056/nejmms2200907.

[34] Marchant, G. and B. Allenby (2017), “Soft law: New tools for governing emerging technologies”, Bulletin of the Atomic Scientists, Vol. 73/2, pp. 108-114, https://doi.org/10.1080/00963402.2017.1288447.

[76] Marchant, G. and W. Wallach (2015), “Coordinating Technology Governance”, Issues in Science and Technology, Vol. 31/4 (Summer 2015), https://issues.org/coordinating-technology-governance/ (accessed on 7 March 2023).

[2] Matasick, C., C. Alfonsi and A. Bellantoni (2020), “Governance responses to disinformation: How open government principles can inform policy options”, OECD Working Papers on Public Governance, No. 39, OECD Publishing, Paris, https://doi.org/10.1787/d6237c85-en.

[36] Mathews, D. (2017), When emerging biomedical technologies converge or collide, Oxford University Press, https://doi.org/10.1093/oso/9780198786832.003.0001.

[65] Matschoss, K. et al. (2020), “Co-creating transdisciplinary global change research agendas in Finland”, European Journal of Futures Research, Vol. 8/1, https://doi.org/10.1186/s40309-020-0162-3.

[90] Matthews, D. (2021), “US to push ‘democracy-affirming technology’ with prizes and research projects”, Science Business, https://sciencebusiness.net/news/us-push-democracy-affirming-technology-prizes-and-research-projects (accessed on 7 March 2023).

[48] Mazzucato, M. (2018), Mission-oriented research & innovation in the European Union, https://op.europa.eu/en/publication-detail/-/publication/5b2811d1-16be-11e8-9253-01aa75ed71a1/language-en (accessed on 7 March 2023).

[42] National Institutes of Health (2021), Novel and Exceptional Technology and Research Advisory Committee, Gene Drives in Biomedical Research Report, https://osp.od.nih.gov/wp-content/uploads/NExTRAC-Gene-Drives-Final-Report.pdf (accessed on 6 March 2023).

[45] National Research Council (2014), Convergence, National Academies Press, Washington, D.C., https://doi.org/10.17226/18722.

[89] OECD (2022), “Sustainability and Safe and Sustainable by Design: Working Descriptions for the Safer”, OECD Environment, Health and Safety Publications Series on the Safety of Manufactured Nanomaterials, No. 105, https://www.oecd.org/officialdocuments/publicdisplaydocumentpdf/?cote=ENV-CBC-MONO(2022)30%20&doclanguage=en (accessed on 7 March 2023).

[21] OECD (2022), Towards an anticipatory innovation governance model in Finland, https://oecd-opsi.org/wp-content/uploads/2021/09/Anticipatory-Innovation-Governance-in-Finland.pdf (accessed on 6 March 2023).

[35] OECD (2021), “Practical Guidance on Agile Regulatory Governance to Harness Innovation”, https://legalinstruments.oecd.org/public/doc/669/9110a3d9-3bab-48ca-9f1f-4ab6f2201ad9.pdf (accessed on 6 March 2023).

[15] OECD (2021), Recommendation of the Council for Agile Regulatory Governance to Harness Innovation, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0464 (accessed on 5 March 2023).

[28] OECD (2021), “Trust in Global Co-operation - the vision for the OECD for the next decade”, https://read.oecd-ilibrary.org/view/?ref=1110_1110970-giiac5g3aj&title=MCM-oct-2021-Trust-in-global-co-operation-Mathias-Cormann (accessed on 6 March 2023).

[33] OECD (2020), “Addressing societal challenges using transdisciplinary research”, OECD Science, Technology and Industry Policy Papers, No. 88, OECD Publishing, Paris, https://doi.org/10.1787/0ca0ca45-en.

[17] OECD (2019), Recommendation of the Council on Artificial Intelligence, https://legalinstruments.oecd.org/en/instruments/oecd-legal-0449 (accessed on 6 March 2023).

[11] OECD (2019), Recommendation of the Council on Responsible Innovation in Neurotechnology, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0457 (accessed on 6 March 2023).

[78] OECD (2019), Regulatory effectiveness in the era of digitalisation, https://www.oecd.org/gov/regulatory-policy/Regulatory-effectiveness-in-the-era-of-digitalisation.pdf (accessed on 7 March 2023).

[18] OECD (2018), OECD Science, Technology and Innovation Outlook 2018: Adapting to Technological and Societal Disruption, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2018-en.

[6] OECD (2018), “Technology governance and the innovation process”, in OECD Science, Technology and Innovation Outlook 2018: Adapting to Technological and Societal Disruption, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2018-15-en.

[61] OECD (2016), “Public engagement in STI policy”, in OECD Science, Technology and Innovation Outlook 2016, OECD Publishing, Paris, https://doi.org/10.1787/sti_in_outlook-2016-11-en.

[30] OECD (forthcoming), “Technology in and for society: innovating well for inclusive transitions - conference report”.

[25] Owen, R., R. von Schomberg and P. Macnaghten (2021), “An unfinished journey? Reflections on a decade of responsible research and innovation”, Journal of Responsible Innovation, Vol. 8/2, pp. 217-233, https://doi.org/10.1080/23299460.2021.1948789.

[79] Pfotenhauer, S. et al. (2021), “Mobilizing the private sector for responsible innovation in neurotechnology”, Nature Biotechnology, Vol. 39/6, pp. 661-664, https://doi.org/10.1038/s41587-021-00947-y.

[72] Piller, F., C. Ihl and A. Vossen (2010), “A Typology of Customer Co-Creation in the Innovation Process”, SSRN Electronic Journal, https://doi.org/10.2139/ssrn.1732127.

[73] Rayna, T. and L. Striukova (2015), “Open innovation 2.0: is co-creation the ultimate challenge?”, International Journal of Technology Management, Vol. 69/1, p. 38, https://doi.org/10.1504/ijtm.2015.071030.

[47] Robinson, D. and M. Mazzucato (2019), “The evolution of mission-oriented policies: Exploring changing market creating policies in the US and European space sector”, Research Policy, Vol. 48/4, pp. 936-948, https://doi.org/10.1016/j.respol.2018.10.005.

[38] Robinson, D. et al. (2021), “Policy lensing of future-oriented strategic intelligence: An experiment connecting foresight with decision making contexts”, Technological Forecasting and Social Change, Vol. 169, p. 120803, https://doi.org/10.1016/j.techfore.2021.120803.

[44] Roco, M. and W. Bainbridge (2013), “The new world of discovery, invention, and innovation: convergence of knowledge, technology, and society”, Journal of Nanoparticle Research, Vol. 15/9, https://doi.org/10.1007/s11051-013-1946-1.

[69] Rodriguez-Calero, I. et al. (2020), “Prototyping strategies for stakeholder engagement during front-end design: Design practitioners’ approaches in the medical device industry”, Design Studies, Vol. 71, p. 100977, https://doi.org/10.1016/j.destud.2020.100977.

[1] Ryan-Mosley, T. (2022), “The world is moving closer to a new cold war fought with authoritarian tech”, MIT Technology Review, https://www.technologyreview.com/2022/09/22/1059823/cold-war-authoritarian-tech-china-iran-sco/ (accessed on 6 March 2023).

[10] Sambuli, N. (2021), “In my view: The promises, pitfalls and potential of global technology governance”, in Development Co-operation Report 2021: Shaping a Just Digital Transformation, OECD Publishing, Paris, https://doi.org/10.1787/67500f38-en.

[66] Scheufele, D. et al. (2021), “What we know about effective public engagement on CRISPR and beyond”, Proceedings of the National Academy of Sciences, Vol. 118/22, https://doi.org/10.1073/pnas.2004835117.

[67] Schroth, F. et al. (2020), “Participatory agenda setting as a process — of people, ambassadors and translation: a case study of participatory agenda setting in rural areas”, European Journal of Futures Research, Vol. 8/1, https://doi.org/10.1186/s40309-020-00165-w.

[24] Shelley-Egan, C. et al. (2017), “Responsible Research and Innovation in the context of human cognitive enhancement: some essential features”, Journal of Responsible Innovation, Vol. 5/1, pp. 65-85, https://doi.org/10.1080/23299460.2017.1319034.

[60] Stirling, A. (2007), “‘Opening Up’ and ‘Closing Down’: Power, Participation, and Pluralism in the Social Appraisal of Technology”, Science, Technology, & Human Values, Vol. 33/2, pp. 262-294, https://doi.org/10.1177/0162243907311265.

[50] STOA (2021), Tackling deepfakes in European policy, Panel for the Future of Science and Technology, https://www.rathenau.nl/sites/default/files/2021-08/tackling_deepfakes_in_european_policy_STOA.pdf (accessed on 7 March 2023).

[84] Sun, T. et al. (2022), “Challenges and recent progress in the governance of biosecurity risks in the era of synthetic biology”, Journal of Biosafety and Biosecurity, Vol. 4/1, pp. 59-67, https://doi.org/10.1016/j.jobb.2022.02.002.

[12] Tech For Good Summit (2020), Tech For Good Summit - Progress Report, https://www.elysee.fr/admin/upload/default/0001/08/23c18d41821bd9bb2505555892fcbc19d52a3b5d.pdf (accessed on 6 March 2023).

[95] The Royal Society (2009), Geoengineering the climate: science, governance and uncertainty, https://royalsociety.org/-/media/Royal_Society_Content/policy/publications/2009/8693.pdf (accessed on 7 March 2023).

[91] The White House (2022), “U.S. and U.K. Launch Innovation Prize Challenges in Privacy-Enhancing Technologies to Tackle Financial Crime and Public Health Emergencies”, Press release, https://www.whitehouse.gov/ostp/news-updates/2022/07/20/u-s-and-u-k-launch-innovation-prize-challenges-in-privacy-enhancing-technologies-to-tackle-financial-crime-and-public-health-emergencies/ (accessed on 7 March 2023).

[16] The White House (2021), U.S. and U.K. Launch Innovation Prize Challenges in Privacy-Enhancing Technologies to Tackle Financial Crime and Public Health Emergencies, https://www.whitehouse.gov/ostp/news-updates/2021/12/08/white-house-announces-launch-of-the-international-grand-challenges-on-democracy-affirming-technologies-for-the-summit-for-democracy/ (accessed on 6 March 2023).

[14] The White House (2021), White House Announces Launch of the International Grand Challenges on Democracy-Affirming Technologies for the Summit for Democracy, https://www.whitehouse.gov/ostp/news-updates/2021/12/08/white-house-announces-launch-of-the-international-grand-challenges-on-democracy-affirming-technologies-for-the-summit-for-democracy/ (accessed on 6 March 2023).

[85] Trump, B. et al. (eds.) (2020), Synthetic Biology 2020: Frontiers in Risk Analysis and Governance, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-030-27264-7.

[43] Tuebke, A. et al. (2001), Strategic Policy Intelligence: Current Trends, the State of Play and Perspectives, https://www.researchgate.net/publication/308900288_Strategic_Policy_Intelligence_Current_Trends_the_State_of_Play_and_Perspectives (accessed on 6 March 2023).

[96] UK Ministry of Defence (2021), Human Augmentation – the Dawn of a New Paradigm, https://www.gov.uk/government/publications/human-augmentation-the-dawn-of-a-new-paradigm (accessed on 7 March 2023).

[20] US State Department (2020), U.S.-EU Trade and Technology Council (TTC), https://www.state.gov/u-s-eu-trade-and-technology-council-ttc/ (accessed on 6 March 2023).

[51] van Boheemen, P. et al. (2020), Cyber resilience with new technology, Rathenau Instituut, https://www.rathenau.nl/sites/default/files/2020-07/REPORT%20Cyber%20resilience%20with%20new%20technology%20-%20Rathenau%20Instituut.pdf (accessed on 7 March 2023).

[54] Van Woensel, L. (2021), Guidelines for foresight-based policy analysis, STOA, https://data.europa.eu/doi/10.2861/39319.

[55] Van Woensel, L. (2020), A Bias Radar for Responsible Policy-Making, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-030-32126-0.

[23] von Schomberg, R. (2013), “A Vision of Responsible Research and Innovation”, in Responsible Innovation, John Wiley & Sons, Ltd, Chichester, UK, https://doi.org/10.1002/9781118551424.ch3.

[3] Wee, S. (2021), “Two Scientific Journals Retract Articles Involving Chinese DNA Research”, The New York Times, https://www.nytimes.com/2021/09/09/business/china-dna-retraction-uyghurs.html (accessed on 6 March 2023).

[32] Winickoff, D. et al. (2021), “Collaborative platforms for emerging technology: Creating convergence spaces”, OECD Science, Technology and Industry Policy Papers, No. 109, OECD Publishing, Paris, https://doi.org/10.1787/ed1e030d-en.

[29] Worthington, R. (1982), “Review of The Social Control of Technology, by David Collingridge”, American Political Science Review, Vol. 76/1, pp. 134-135, https://doi.org/10.2307/1960465.

[62] Wynne, B. (2006), “Public Engagement as a Means of Restoring Public Trust in Science – Hitting the Notes, but Missing the Music?”, Public Health Genomics, Vol. 9/3, pp. 211-220, https://doi.org/10.1159/000092659.

[57] Wynne, B. (1991), “Knowledges in Context”, Science, Technology, & Human Values, Vol. 16/1, pp. 111-121, https://doi.org/10.1177/016224399101600108.


← 1. The Office for Technology Assessment was formally created in 1972 and closed in 1995.

← 2. In 1994, NOTA was renamed the Rathenau Institute (www.rathenau.nl).

← 3. The agenda can be found at the following link (in Portuguese): https://www.fct.pt/agendastematicas/docs/Agenda_Industria_Manufatura_Final.pdf (accessed 30 September 2022).

← 4. Further information on the German Initiative “IdeenLauf” (in German) available at: https://www.wissenschaftsjahr.de/2022/ideenlauf (accessed 24 November 2022).

← 5. https://ebrains.eu/ (accessed 22 September 2022).

← 6. Further information on Nanocrafter (https://citizensciencegames.com/nanocrafter-playing-game-synthetic-biology/) or iGem online (https://igem.org/) (both accessed 24 September 2022).

← 7. https://www.digitalskillup.eu/ (accessed 22 September 2022).

← 8. https://www.aaas.org/programs/science-technology-policy-fellowships (accessed 22 September 2022).

← 9. Further information on the BMBF citizen science funding programme on STIP Compass online: https://stip.oecd.org/stip/interactive-dashboards/policy-initiatives/2019%2Fdata%2FpolicyInitiatives%2F24328 (accessed 30 November 2022).

← 10. Lorraine Fab Living Lab: https://lf2l.fr/ (accessed 24 September 2022).

← 11. https://tekno.dk/project/sockets/?lang=en (accessed 25 November 2022).

← 12. https://www.iso.org/standard/67446.html?browse=tc (accessed 22 September 2022).

← 13. https://www.ifpma.org/wp-content/uploads/2018/09/IFPMA_Code_of_Practice_2019-1.pdf (accessed 23 September 2022).

← 14. The UK Royal Society provided one authoritative definition of “geoengineering” in 2009: “the deliberate large-scale manipulation of the planetary environment to counteract anthropogenic climate change” (The Royal Society, 2009[95]).

← 15. The UK Ministry of Defence has proposed a definition of “human augmentation” as “the application of science and technologies to temporarily or permanently improve human performance”, and divides the field further into “human performance optimisation and human performance enhancement” (UK Ministry of Defence, 2021[96]).

← 16. http://wwviews.org/the-world-wide-views-method/ (accessed 04 October 2022).

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2023

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at https://www.oecd.org/termsandconditions.