Chapter 5. Ensuring that training is of good quality

Ensuring that training is of good quality – e.g. that it is effective, complies with quality standards, and is certified – is fundamental to ensure that available Training Fund resources are put to their best use. Continuous monitoring of the activities of the Training Funds through good information systems, as well as regular evaluation of the impact of training on workers’ labour market outcomes and firms’ performance, is essential to keep track of the quality of training. This chapter analyses the mechanisms put in place in Italy to ensure that TF-supported training is of good quality.


5.1. Quality assurance mechanisms

The vast majority of TF-supported training is delivered by non-formal training providers that fall outside the responsibility of standard quality-monitoring institutions.1 This makes it difficult to monitor the quality of TF-supported training. Indeed, Table 5.1 shows that employers are the main providers of TF-supported training (49.5% of all training plans), followed by training providers/training agencies (23.1%), consultancy and/or training firms (16.3%), while only a very small minority of all training plans are delivered by formal institutions – e.g. university (0.5%), private/public educational institutions (0.3%), and public/private research centres (0.2%).

While Training Funds can use different strategies to ensure that the training they finance is of good quality, in general there are no homogeneous training quality standards with which all Training Funds must comply. The next sections look at how different quality assurance mechanisms can influence the quality of TF-supported training, focusing on three aspects: (i) accreditation of training providers; (ii) funding allocation procedures; and (iii) monitoring of training providers.

Table 5.1. Providers of TF-supported training, 2016

Providers of TF-supported training        Number of training plans    % of all training plans
Beneficiary enterprise                    15 282                      49.5
Training provider/training agency         7 143                       23.1
Consultancy and/or training firm          5 041                       16.3
Other enterprise (not beneficiary)        ..                          ..
University                                ..                          0.5
Private/public educational institution    ..                          0.3
Parent company/controlling undertaking    ..                          ..
Private/public research centres           ..                          0.2
Consortium of beneficiary enterprises     ..                          ..
Ecclesiastical institution                ..                          ..
Data not available                        2 793                       ..
All providers                             29 088                      ..

1. Data refer to training plans approved between January and December 2016.

2. The question provides multiple answer choices. The sum of the individual answers is used for the calculation of shares.

Source: ANPAL’s elaborations based on the Nexus database.
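The share calculation described in note 2 can be sketched as follows (a minimal illustration with made-up answer counts, not the actual Nexus figures; because a plan can give multiple answers, the denominator is the sum of individual answers rather than the number of plans):

```python
# Illustrative answer counts per provider category (hypothetical values).
answer_counts = {"employer": 990, "agency": 462, "consultancy": 326, "other": 222}

# Shares are computed over the sum of answers, not the number of plans,
# since one training plan can name several providers.
total_answers = sum(answer_counts.values())  # 2000 in this toy example
shares = {k: round(100 * v / total_answers, 1) for k, v in answer_counts.items()}
```

This is why the percentage column in Table 5.1 cannot be reproduced by dividing each plan count by the 29 088 total plans.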

5.1.1. Accreditation of training providers

Each TF has its own accreditation strategy to ensure that funding goes to training providers of good quality. Most Training Funds rely on existing regional accreditation systems to select training providers, while many have developed their own accreditation systems.

The lack of a common rule on accreditation procedures, valid for all Training Funds across the whole territory, raises several challenges. First, it may lead some Training Funds to establish preferential partnerships with training providers, potentially allowing low-quality training providers to obtain Training Fund financing.

Second, it could result in large differences in the quality of training provided across different regions/Training Funds. Indeed, quality criteria vary significantly between regional accreditation systems, and, when in place, between different Training Funds accreditation systems.

Third, such a fragmented approach could also make it difficult for potential participants and firms to compare different programmes and providers, and thus to take informed decisions on which training provider to choose.

Another key challenge is that existing accreditation systems (whether regional or developed by Training Funds) often focus on training providers’ compliance with formal procedures rather than on the actual quality of the training offered.

While quality is subjective and typically difficult to measure, some Training Funds are currently trying to improve their own accreditation systems with a view to embedding elements of quality. For example, Fondimpresa is planning to adjust its accreditation system so that only training providers that train their teachers on a regular basis can receive/keep accreditation. However, this remains quite an isolated practice.

To overcome these challenges, discussions are ongoing on whether Italy should build a national accreditation system – developed and managed by ANPAL – which would establish a set of training quality-standard criteria valid for all training providers. Given the plethora of training providers in Italy, the challenge would be to monitor, on a regular basis, that quality criteria are met. The other challenge of adopting a national accreditation system is that, at the moment, accreditation of training providers is still under the responsibility of regions.

Perhaps Italy could learn from the experience of OECD countries that have been confronted with similar challenges (OECD, 2019[1]). In France, the Décret Qualité (which came into force in 2017) establishes a set of quality standard criteria with which all training providers must comply in order to access funding from training funds (i.e. the OPCA) and other public financing bodies (see Box 5.1).

5.1.2. Funding allocation procedures

Other than accreditation systems, there are also other mechanisms Training Funds can use to ensure that training is of good quality. For example, funding allocation procedures – i.e. the selection process through which funding (from collective accounts) is allocated to training proposals/plans – can have crucial implications for training quality.

Again, there is no standard rule and there are large variations in the practices adopted by the Training Funds. Some Training Funds (e.g. Foncoop, Fondoprofessioni, Fondimpresa) – with a view to minimising conflicts of interest – outsource assessment procedures to external committees, whose members are identified through a careful selection process. Other Training Funds evaluate training plans themselves.

The methodology used for grant allocation also differs across Training Funds. Training Funds typically have two options: (i) first-come-first-served procedures (i.e. “avvisi a sportello”), whereby Training Funds assess whether training proposals meet minimum criteria (previously identified in the grant description) and authorise funding in chronological order of receipt until resources are exhausted; and (ii) deadline procedures, whereby Training Funds establish a deadline after which all training proposals are evaluated together and ranked according to previously defined quality criteria.

The methodology used for grant allocation can have major implications for training quality. For example, the first-come-first-served procedure guarantees rapid access to funding (see Section 4.5), but puts less emphasis on quality. The deadline procedure allows Training Funds to reward quality (e.g. additional points can be given to training proposals that are innovative, or that draw on SAA exercises to identify firms’ skills needs), but is also more time-consuming and could therefore hamper Training Funds’ ability to respond quickly to firms’ skills needs (see Section 4.5).
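The contrast between the two procedures can be sketched in a few lines of code (illustrative only; the field names, scoring, and selection logic are assumptions, not the actual rules of any Training Fund):

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    name: str
    arrival_order: int     # order of receipt
    cost: float            # requested funding (EUR)
    quality_score: float   # points against predefined quality criteria
    meets_minimum: bool    # passes the minimum eligibility check

def first_come_first_served(proposals, budget):
    """Fund eligible proposals in order of receipt until resources are exhausted."""
    funded = []
    for p in sorted(proposals, key=lambda p: p.arrival_order):
        if p.meets_minimum and p.cost <= budget:
            funded.append(p.name)
            budget -= p.cost
    return funded

def deadline_ranking(proposals, budget):
    """After the deadline, rank all eligible proposals by quality and fund the best first."""
    funded = []
    ranked = sorted((p for p in proposals if p.meets_minimum),
                    key=lambda p: p.quality_score, reverse=True)
    for p in ranked:
        if p.cost <= budget:
            funded.append(p.name)
            budget -= p.cost
    return funded
```

With the same proposals and budget, the two procedures can select entirely different sets of plans: the first rewards speed of submission, the second rewards the quality score.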

Again, Training Funds are free to adopt the strategy they deem most appropriate, reflecting whether they prefer to emphasise responsiveness or training quality. Fondirigenti, for example, relies solely on deadline procedures, with a view to steering quality.

Considering the wide variation across Training Funds in the funding allocation procedures adopted, there may be scope for ANPAL to include relevant regulation in the ANPAL Guidelines (see Section 2.3.3 and Annex A). For example, ANPAL Guidelines could make it compulsory for all Training Funds to rely on an external, independent, committee to assess training plans submitted by firms – as already done by some Training Funds.

5.1.3. Monitoring of training providers

Training Funds monitor training programmes mainly through on-the-spot inspections, i.e. to assess whether training has actually taken place. After TF-supported training has taken place, firms must report on their training activities, and the Training Funds monitor closely that training complies with their guidelines. In cases of deviation, the TF may prescribe corrective actions, e.g. withdrawal from the list of accredited training providers, or withdrawal of the funding.

One key challenge is that – as reported by many stakeholders – monitoring by the Training Funds is often only formal (e.g. it is only meant to establish whether the training has actually taken place), rather than focussed on the quality of the training being delivered.

Another key challenge is that Training Funds typically monitor training financed through collective accounts, while there is no obligation for the Training Funds to monitor training financed through individual accounts.

Finally, some stakeholders report that often there is no follow-up discussion between employees and firms at the end of the training programme. More systematic and structured discussions after training would help both workers and firms to better identify the skills developed during training, and understand how to best apply these skills at work. It would also provide a useful feedback for firms on training’s usefulness, effectiveness, and relevance for workers.

Box 5.1. Establishing quality criteria for training providers: the example of France

Since 2014, different training financing bodies in France – including the OPCA, regions, and the Public Employment Service (Pôle emploi) – have collaborated to harmonise quality criteria among all training providers.

The fruit of this collaboration was formalised in the Décret Qualité2, which was introduced in 2015 and came into force in January 2017. The law establishes a set of quality criteria with which training providers must comply in order to access financing.

There are a total of six criteria – broken down into 21 different indicators – that revolve around:

  • the objectives of training;

  • the existence of procedural control mechanisms;

  • the adequacy of the pedagogical tools used;

  • the quality of teachers;

  • the accessibility of information on training offer to the general public;

  • the evaluation of training programmes.

A dedicated online platform – the Data Dock – allows training providers to register in the system and self-assess against these dimensions. Training providers also need to provide supporting documents that justify their self-assessment for each of the 21 indicators.

A simplified procedure exists for training providers certified by CNEFOP (Conseil national de l'emploi, de la formation et de l'orientation professionnelles). After providing proof of certification on the Data Dock, these training providers are exempt from providing documentation in the self-assessment.

Each financing institution (including the OPCA) can choose among the training providers validated in the system and select those it wishes to work with.


5.2. Strengthening the skills certification system

Skills certification is a formal process through which skills and knowledge (obtained either through formal, non-formal, or informal learning on the job) are validated. Skills certification has several benefits: it can make the outcomes of training participation more visible, transparent, and easily signalled to employers; and may provide further incentives for workers to participate in training.

In recent years, Italy has made significant efforts to create a skills certification system that can be applied consistently across the country. A number of norms3 have been put in place since 2013 to move from Regional Qualification Frameworks (Quadri Regionali degli Standard Professionali) towards a National Qualification Framework (Quadro Nazionale delle Qualificazioni) (OECD, 2017[2]). Today the NQF – officially instituted in January 2018 – applies a common skills taxonomy to all regions by creating correspondences between the qualifications defined in different RQFs.4

Targeted efforts have also been made to strengthen the role that Training Funds can play in promoting skills certification practices in Italy. In 2012, the so-called Fornero Law (Law 92/2012) identified Training Funds as potential active players in the Italian skills certification system (Casano et al., 2017[3]).

More recently, the Guidelines instituted by ANPAL in 2018 (see Section 2.3.3) take further concrete steps to promote the certification of skills obtained through TF-supported training. In particular, the Guidelines adopt a learning outcomes approach: to be eligible for funding, training plans need to explicitly mention what skills/competences will be developed. Rather than making skills certification compulsory for all TF-supported training – which would be costly and time-consuming – this approach aims to create the conditions for making skills certification easy and possible at any time.

Beyond these government directives, Training Funds (see Box 5.2) and social partners are also taking independent steps to encourage skills certification practices. In 2010, the largest employers’ organisation operating in Lombardy (Assolombarda) and the largest trade unions (CGIL, CISL, UIL), in collaboration with the Lombardy Region, joined forces to encourage the certification of skills acquired through TF-sponsored training. In the context of this initiative, social partners committed to have at least 5% of training plans5 certified, for example through the regional certification system. As a result, between 2012 and 2017, around 17 000 people received Training Fund-sponsored training that led to a certification (Assolombarda, CGIL, CISL, UIL, 2018[4]).

The share of TF-supported training leading to no certification has also declined in recent years (see Table 5.2). This drop was mainly driven by an expansion in certifications delivered by regions, training providers and Training Funds.

Despite the progress made, a large share of TF-supported training still remains uncertified. Moreover, even when certification takes place, it is often conducted by different bodies – training providers or Training Funds (42.7%), regions (12%), or other certification bodies (7.2%) – potentially leading to a fragmented system (see Table 5.2). This leaves room to extend certification practices and harmonise the skills certification system further, building on the recent positive developments.

Table 5.2. Certification of skills acquired through TF-supported training
% of participants in TF-supported training, by year




No certification                          ..
Regional certification                    ..
Training provider or TF certification     ..
Other certification                       ..

Note: The question provides multiple answer choices. The sum of the individual answers is used for the calculation of shares.

Source: (ANPAL, 2018[5]).

Placing Italy in the international context, evidence from other European OECD countries shows that there is large room for improvement. Elaborations of the 2016 European Adult Education Survey show that in Italy only 35% of adults participating in non-formal job-related training received a certificate upon completion, compared with an average of 46%, and far below countries such as Poland (69%) and Lithuania (72%) (Figure 5.1).

Figure 5.1. Certification of non-formal learning activities, Italy and European countries, 2016
% of participants in non-formal job-related learning

Source: Adult Education Survey (AES), 2016.

There are many reasons why TF-supported training often remains uncertified, despite the efforts undertaken by the government, Training Funds, and social partners in recent years. First, in the view of many stakeholders, there seems to be a fundamental incompatibility between regional certification systems and TF-supported training. Regional certification systems are ill-suited to adults’ continuous learning, as they were originally conceived for initial education. For example, regional certification systems typically certify competences associated with “low-skilled” professions, while some categories of high-skilled professions (e.g. in the banking and insurance sector) remain uncovered.

Second, the duration of TF-supported training is typically too short to lead to a full regional certification. Indeed, while legislation allows regional certification for a minimum of twenty-four hours of training, 81.8% of TF-supported training programmes last less than that (see Table 5.3) (ANPAL, 2018[5]).

Table 5.3. Number of hours of TF-supported training

                         % of training programmes    Cumulative %
Up to 8 hours            ..                          ..
From 8 to 16 hours       ..                          ..
From 16 to 24 hours      ..                          81.8
From 24 to 32 hours      ..                          ..
From 32 to 40 hours      ..                          ..
From 40 to 48 hours      ..                          ..
From 48 to 56 hours      ..                          ..
From 56 to 64 hours      ..                          ..
From 64 to 72 hours      ..                          ..
From 72 to 80 hours      ..                          ..
Over 80 hours            ..                          ..


Note: Data includes training programmes approved in 2015 and started in 2016.

Source: ANPAL’s elaborations based on the Nexus Database.

Third, TF-supported training plans can cut across (and be delivered in) different regions (e.g. in the case of sectoral-level training plans) – making it difficult to effectively use regional certification systems. While the NQF creates a national “taxonomy” which links RQFs with one another, stakeholders claim that correspondences are not always ensured.

Finally, there also seems to be a widespread negative perception of skills certification in Italy, which may discourage workers and firms from undertaking certification procedures. Indeed, skills certification still seems to be considered a low-status practice, reserved for the low-skilled and/or for workers in low-skilled occupations.

Overall, while it seems unrealistic to expect every single TF-supported training programme to be certified – not least because skills certification takes time, is costly, and certification systems need to be updated frequently to reflect changing skills needs in each profession – more can still be done, building on the significant efforts undertaken in recent years by different actors.

On top of continuing to implement the ANPAL Guidelines, going forward it will be important to develop complementary policies with a view to strengthening certification practices further. For example, a minimum training duration could be set, so as to ensure that the (short) length of training courses does not compromise certification. Moreover, each training plan could explicitly refer to a specific skills certification system (e.g. regional certification; TF certification).

Box 5.2. Good practices in skills certification: examples from Training Funds

Certify skills

In 2010, the Fondo Banche e Assicurazioni (FBA), the Training Fund for the banking and insurance sector, developed a Commercial Banks Qualifications Inventory. The initiative reflected the inadequacy of regional certification systems to appropriately define the competences of banking employees, and was introduced with a view to creating synergies with the European Qualification Framework (EQF). The Inventory defines a list of job profiles encompassing four types of information, namely the job’s (i) title, (ii) purpose, (iii) main responsibilities and activities, and (iv) competence profile. In particular, the competence profile lists the knowledge and capabilities required to do the job (Durante, Fraccaroli and Gallo, 2015[6]). The project was led by FBA and the Italian Banking Association (ABI); it proactively involved banks and trade union representatives of the banking industry, and received the technical support of the then-ISFOL (now INAPP). Building on the Inventory, today FBA is accredited6 to certify 32 different job profiles and offers the possibility to certify skills for free. So far, around 1 000 employees of the banking sector have obtained certification by FBA. Because the Inventory is also linked to the EQF, the certification is recognised in other European countries, and is valid for a period of 3 to 5 years depending on the job profile (Masiello, n.d.[7]).

Reward skills certification

Some Training Funds (e.g. Fondimpresa) provide higher scores to training plans that foresee skills certification after training completion.

Subsidise the cost associated with certification

Some Training Funds subsidise the costs associated with certification procedures – a practice also implemented by training funds in other OECD countries, e.g. the Netherlands.7 For example, one Training Fund requires (through its Grant 35) that at least 30% of resources be devoted to activities other than training, including workers’ skills certification.

Develop training programmes following a learning outcomes approach

Another good example is the “SMART CARD Competenze”, carried out since 2015 by Fondo Formazione PMI (FAPI). This is a skills certification instrument through which training providers can submit training proposals broken down into “skills units” as defined in RQFs (ANPAL, 2018[5]). A support service system has been put in place to help training providers navigate the new process.

5.3. Fostering a healthy competitive environment among Training Funds

Training Funds are in competition with one another. Their lack of direct linkages to economic sectors8, combined with the principle of “portability”, means that firms can enrol in the Training Fund of their choosing and/or switch to a different Training Fund virtually at any time. As an illustration, some of the newer Training Funds9 are increasing their catchment area not so much by reaching out to firms outside the Training Funds system, but rather by attracting firms already enrolled in other, older, Training Funds (ANPAL, 2018[5]).10 For example, Fonarcom (established in 2006) is today the second largest Training Fund in Italy by number of firms enrolled, after Fondimpresa (established in 2002).

While, in theory, a certain amount of competition is desirable as it may push Training Funds to improve the quality of their services, in practice this does not happen: many Training Funds try to attract firms by easing administrative procedures rather than by focussing on quality.

There are several reasons as to why this may happen. First, some firms (especially SMEs that struggle to articulate their skill needs) show little interest in training and may pay little attention to the quality of courses offered (see Chapter 3).

Second, in smaller firms, training decisions (including on what Training Funds to choose) are often outsourced to external business consultants (Casano et al., 2017[3]) who, however, are not always fully aware of the firm’s skills needs and may orient their decisions based on administrative (rather than skills needs) considerations.

In the past, weak implementation of portability rules11 and a blurry legal framework12 also played a role in shaping an unhealthy competitive environment, although the Guidelines established by ANPAL in 2018 (see Chapter 2 and Annex A) already take steps to address these challenges. For example, the Guidelines make the rules around portability stricter and clearer, and update the legal framework with a view to making all Training Funds play by a clear and shared set of rules.

Going forward, there are additional policy options that the government could explore to promote virtuous competition among Training Funds and ultimately improve training quality. It is worth noting, however, that the international experience is inconclusive in this respect: countries adopt very different approaches, there is no one-size-fits-all solution, and each approach has its strengths and weaknesses:

  • Reduce the number of Training Funds, for example by merging some of the smaller or worse-performing Training Funds. This option – which has already been proposed by various actors (Casano et al., 2017[3]) – would generate economies of scale and reduce overall administrative costs, on top of giving enrolled firms an opportunity to interact with a wider platform of firms. On the other hand, it may imply looser relationships between Training Funds and firms. In the international context, there is no ideal number of training funds, and practices vary considerably across countries, ranging from one national training fund, e.g. in Spain, to about 132 training funds in the Netherlands. Some countries, however, have taken steps to streamline the system and reduce the number of funds in recent years. In France, for example, the number of training funds was reduced from around 96 to 20 in 2009 (Podevin et al., 2018[8]).

  • Link each Training Fund to an economic sector, bringing the levy back to its original objectives.13 This policy option would make it easier for social partners to agree on a shared training strategy based on common training needs. On the downside, there is a risk of investing solely in skills that are too sector-specific, thereby limiting workers’ cross-sectoral mobility and investment in transversal skills (OECD, 2017[9]).14 Again, different countries adopt different approaches. In some countries training funds are linked to economic sectors (e.g. Belgium, Ireland, the Netherlands, South Africa); in others they are cross-sectoral (e.g. Finland, Greece, Hungary, Poland); while in others still both types of training fund coexist (e.g. Norway, France).

  • Set minimum operational standards, against which each Training Fund could be regularly assessed in order to continue operating. Minimum standards could include quantitative indicators (e.g. number of workers served or amount of EUR collected) as well as qualitative ones. However, quality is typically harder to define and certainly more difficult to monitor. The international experience shows that while some countries do set minimum operational standards – e.g. France, where each training fund (OPCA) needs to collect a minimum amount of financial resources (EUR 100 million) in order to continue operating – this is not systematically done in all countries with a levy scheme in place.

5.4. Strengthening information systems

Collecting information on the activities of the Training Funds is fundamental to monitor who is benefiting from training and who is being left behind, understand where resources are being invested, and keep track of progress.

The Nexus database is the ad-hoc information system conceived to regularly collect data on the activities of the Training Funds. Operative since 2008, the Nexus database was previously managed by INAPP and is now under the aegis of ANPAL.

It collects valuable information on a regular basis (every six months) from all Training Funds. The information collected mainly focusses on training plans (e.g. number of participants, objectives of the training, financial resources allocated), individual training programmes (e.g. thematic areas, modality of training delivery, certification), as well as the beneficiaries of training (firms and workers).

The data collected through the Nexus database are disseminated in a statistical report on continuous learning (Rapporto Sulla Formazione Continua) – which is published on a yearly basis by ANPAL.15

Despite being an essential tool to ensure that standard information is collected across all Training Funds on a regular basis, the database has several limitations that need to be addressed:

  • No information is collected at the individual (worker) and firm level (the unit of analysis is the individual training plan). Designed this way, the database can generate double counting, making it difficult to extract the actual number of participants or the number of firms involved in training, or to analyse training participation dynamics by individual and firm characteristics (Casano et al., 2017[3]).

  • Information feeds into the database in scattered ways. For example, information sometimes refers to previous semesters/years.16 This can significantly bias the results (and explains why data sometimes vary significantly from one year to the next) and ultimately undermine data elaborations (ANPAL, 2018[5]).

  • Some aspects of the activities of the Training Funds are currently overlooked. For example, there is no information on the quality of the training provided, or on the financing channels used to sponsor training (e.g. collective vs individual accounts) (Casano et al., 2017[3]). A related challenge is that data are collected at a high level of aggregation. To give one example, while the database collects information on certification, it only captures whether certification has taken place and by what entity it was carried out (e.g. region; Training Fund; training provider; or other private entity) (see Section 5.2). Going forward, information could be collected at a finer level of disaggregation, for example on the type of competences certified. To give another example, there are only 14 “thematic areas” included in the Nexus database. Going forward, the number of thematic areas could be increased. This would allow Training Funds to be more precise about the type of skills their training aims to develop, and at the same time would allow policy makers and researchers to better understand what skills are being developed through Training Funds.

  • There is limited communication with other existing information systems. For instance, the Nexus database does not communicate with regional information systems, or with the stand-alone information system developed by Forma.temp – the Training Fund dedicated to temporary agency workers.
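The double-counting problem described in the first point above can be illustrated with a toy example (hypothetical records; the actual Nexus schema differs):

```python
# Toy plan-level records: the same worker can appear in several training plans.
# Summing 'participants' per plan counts participations, not distinct people.
plans = [
    {"plan_id": "P1", "participant_ids": ["w1", "w2", "w3"]},
    {"plan_id": "P2", "participant_ids": ["w2", "w3"]},
    {"plan_id": "P3", "participant_ids": ["w3", "w4"]},
]

# Plan-level total: overstates the number of workers reached.
participations = sum(len(p["participant_ids"]) for p in plans)

# Worker-level total: deduplicate by an (anonymised) worker identifier, as the
# individual-level reporting required by the ANPAL Guidelines would allow.
distinct_workers = len({w for p in plans for w in p["participant_ids"]})
```

In this sketch the plan-level count is 7 participations while only 4 distinct workers were actually trained, which is why individual-level identifiers are needed to analyse participation by worker and firm characteristics.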

The government is already taking steps to address some of these challenges. ANPAL is planning to create an integrated information system on continuous learning, which should collect information on individuals’ training financed either entirely or partially by public resources (e.g. by Training Funds; regions), and will be able to follow workers among different training programmes, jobs and employment statuses. Not only would this integrated system shed further light on workers’ training attitudes, but it would also allow policy makers to monitor co-ordination and duplication of training actions.

Another step in the right direction is the new obligation – imposed through the ANPAL Guidelines (see Section 2.3.3) – that Training Funds provide anonymised information at the individual worker level. This information will significantly strengthen the data elaboration capacity of the Nexus database and ultimately improve monitoring.

Going forward, in order to improve the current information system and make it more responsive to users’ needs, it will be important to establish more systematic opportunities for dialogue between the different actors involved in the Nexus database, including Training Funds, ANPAL, INPS, and regions. One option worth exploring would be to establish a forum for discussion managed and led by a National Observatory for Adult Learning (see Section 6.1).

5.5. Evaluating the impact of training on firms and workers’ performance

Impact evaluation analysis can shed light on the benefits associated with training, e.g. on firms’ performance or workers’ wages. The results could encourage firms to provide more training to their employees, and increase workers’ incentives to participate in training.17 Impact evaluation could also help Training Funds redirect resources to the most effective interventions, which is crucial in the context of tight budgets (see Chapter 2), as well as allow Training Funds to scale up successful pilots and terminate unsuccessful ones.

Training Funds can directly evaluate the quality of the training they sponsor, for example by asking training participants and firms to express their views on the effectiveness and usefulness of training. Fondimpresa, for example, has established a research centre that evaluates the impact of training on various aspects of workers’ and firms’ performance, both at the national and regional level. For instance, the evaluation sheds light on participants’ satisfaction and perceptions of the usefulness of training. The research centre also conducts surveys of firms’ satisfaction, e.g. looking at whether training has improved workers’ skills, efficiency, output quality, and ability to integrate into the work environment (Fondimpresa, 2016[10]; Fondimpresa, 2018[11]).

Training Funds can also outsource impact evaluation analysis to external, independent evaluators. At least three examples can be highlighted:

  • Fondirigenti has recently promoted a study – conducted by the University of Trento and published in the journal Industrial Relations of the University of California, Berkeley – on the returns to investment in middle-manager training in terms of firms’ performance. The study shows that, among medium-sized and large firms, a 1 percentage point increase in training hours leads to an increase in total factor productivity of about 0.12%. Results for small firms are positive but not statistically significant (Feltrinelli, Gabriele and Trento, 2017[12]).

  • One Training Fund, in cooperation with Sapienza University, has started a process of evaluation of one of its grants (i.e. Grant 35). The evaluation – qualitative in nature – aims to identify the factors that have favoured (and hampered) the effectiveness of the grant, and whether the grant has achieved its objectives. The research (conducted between 2017 and 2018) represents a first evaluation model that will pave the way for future and more systematic evaluation analyses conducted by the Training Fund.

  • Fondimpresa, in collaboration with INAPP, has started a pilot project to evaluate the impact of learning on participants and firms. The pilot collects quantitative data and case studies on firms that benefited from training financed by Fondimpresa in the period 2016-2017. The pilot project will be scaled up to cover a larger number of firms through online questionnaires.

Notwithstanding these interesting initiatives autonomously undertaken by Training Funds, various observers agree that Italy still lacks a systematic effort to evaluate the impact of sponsored training, conducted on a regular basis across all Training Funds, and preferably by an independent and objective research body.

Indeed, while the above-mentioned initiatives undertaken by Training Funds are extremely important in shedding light on the benefits of learning and the effectiveness of training programmes, they are still conducted on an ad hoc basis. Moreover, they risk creating a fragmented picture in which different aspects of learning are evaluated and results are rarely comparable with one another.

Italy is not alone in facing this challenge. Rigorous evaluations of the effectiveness of levy programmes are extremely uncommon among countries that have training funds in place. Analyses are mostly limited to comparisons of outputs against targets (e.g. the number of persons trained) (Müller and Behringer, 2012[13]; Cedefop, 2008[14]; UNESCO, 2018[15]).

However, some useful international good practices exist. In Ireland, for example, an independent evaluation of the activities of Skillnet programmes is carried out every year. The evaluation, which is supported by extensive new primary research, stakeholder engagement and detailed analysis, includes a particular focus on assessing the alignment of activities with the requirements of the National Training Fund, including the need to ensure value for money in the utilisation of public resources (Indecon International Economic Consultants, 2017[16]).

What is encouraging is that Italy already possesses the technical capacity to conduct systematic impact evaluation analysis: INAPP and/or ANPAL, for example, would be well placed to take on this function. Moreover, Italy could also build on the experience of the European Social Fund, which is very demanding in terms of ex ante, in itinere, and ex post evaluation, and for which a robust monitoring procedure has been put in place to carefully assess the outcomes of programmes.18

Concerted efforts to go in this direction will be crucial in the future. A National Observatory for Adult Learning (see Section 6.1) could coordinate or at least support a more systematic effort to evaluate and assess the effectiveness of TF-supported training.


[5] ANPAL (2018), “XVIII Rapporto sulla Formazione Continua”, (accessed on 28 March 2018).

[4] Assolombarda, CGIL, CISL, UIL (2018), Il Lavoro a Milano.

[17] Autorità Garante della Concorrenza e del Mercato (2016), Bollettino, (accessed on 9 May 2018).

[3] Casano, L. et al. (2017), “Bilateralità e formazione”, (accessed on 7 February 2018).

[14] Cedefop (2008), “Sectoral Training Funds in Europe”, CEDEFOP.

[6] Durante, G., A. Fraccaroli and G. Gallo (2015), Manual for the Certification of Commercial Banks Qualifications According to the European Qualifications Framework (EQF) Principles, Fondo Banche Assicurazioni, (accessed on 9 May 2018).

[12] Feltrinelli, E., R. Gabriele and S. Trento (2017), “The Impact of Middle Manager Training on Productivity: A Test on Italian Companies”, Industrial Relations.

[11] Fondimpresa (2018), Rapporto sulle attività di monitoraggio valutativo: anno 2017.

[10] Fondimpresa (2016), “Rapporto sulle attività di monitoraggio valutativo - anno 2016”, (accessed on 30 March 2018).

[16] Indecon International Economic Consultants (2017), Evaluation of Skillnets, (accessed on 27 June 2018).

[7] Masiello, A. (n.d.), Professione bancario: valutazione e certificazione delle qualifiche, (accessed on 9 May 2018).

[13] Müller, N. and F. Behringer (2012), “Subsidies and Levies as Policy Instruments to Encourage Employer-Provided Training”, OECD Education Working Papers, No. 80, OECD Publishing, Paris.

[1] OECD (2019), Getting Skills Right: Future-Ready Adult Learning Systems, Getting Skills Right, OECD Publishing, Paris.

[2] OECD (2017), Getting Skills Right: Italy, OECD Publishing, Paris.

[9] OECD (2017), “OECD Skills Strategy Diagnostic Report: The Netherlands”, (accessed on 31 March 2018).

[8] Podevin, G. et al. (2018), Transformation des OPCA au fil des réformes récentes : vers un nouveau modèle économique ?, CÉREQ ÉTUDES.

[15] UNESCO (2018), Funding skills development: The private sector contribution, (accessed on 26 June 2018).


← 1. For example, the quality of the university system is monitored by ANVUR (National agency for the evaluation of the university and research system). The quality of Provincial Centres for Adult Education (CPIA) is monitored by INVALSI (National institute for the evaluation of the education system).

← 2. Decree No. 2015-790 of 30 June 2015.

← 3. Law 13/2013 and the Inter-Ministerial Decree of 30 June 2015.

← 4. See the Atlante del Lavoro e delle Qualificazioni:

← 5. Expressed in number of training hours.

← 6. By Accredia – the Italian Accreditation Body (Ente Italiano di Accreditamento).

← 7. This is what happens, for example, in some OECD countries – e.g. the Netherlands (OECD, 2017[9]) - where certification procedures are directly financed by training funds.

← 8. Today, in fact, most Training Funds are not linked to a specific economic sector. There are some notable exceptions: For.Agri (agriculture), Fondo Banche e Assicurazioni (banking and insurance), Fond.ER (religious bodies) and Fondoprofessioni (professional firms).

← 9. I.e. Training Funds created after 2008.

← 10. These “new” Training Funds are becoming increasingly important: from 2011 to 2016, firms’ enrolment rates rose from 12.2% to 33.7% (Casano et al., 2017[3]). And while the share of resources managed by the new Training Funds is limited – reflecting the fact that they attract many SMEs that contribute little due to their smaller workforces – it is still increasing: from 2014 to 2017, the share of resources managed by these “new” Training Funds grew from 8.9% to 13.5% (ANPAL, 2018[5]).

← 11. The principle of portability – which in theory should pave the way for a healthy competitive system – was not working well, at least before the implementation of the ANPAL Guidelines. Indeed, many firms are not eligible: the rule does not apply to firms with fewer than 50 employees and/or firms with an annual turnover of less than EUR 10 million, and it applies only when the firm can transfer at least EUR 3 000. The rule is also weakly implemented. As already highlighted by the Italian Competition Authority (AGCM), in past years some Training Funds have tried to obstruct or delay portability, for example by developing ad hoc internal rules which in practice limited implementation (Autorità Garante della Concorrenza e del Mercato, 2016[17]). As other examples, some Training Funds have missed the deadlines normally permitted by law to follow up on firms’ requests for mobility, while on other occasions Training Funds have burdened firms with additional requests for documentation as a way to delay portability.

← 12. The blurry legal framework which has regulated the activities of the Training Funds for around 15 years favoured the use of practices of ambiguous legitimacy. For example, in 2016 the Italian Competition Authority (AGCM) pointed to the lack of transparency and accountability in the way some Training Funds allocated funding (Autorità Garante della Concorrenza e del Mercato, 2016[17]).

← 13. Article 48 of Law 388/2000 provided for the possibility of creating Training Funds for each of the economic sectors of industry, agriculture, tertiary services and manufacturing (Casano et al., 2017[3]).

← 14. Indeed, one key challenge in the Netherlands is that training funds do not sufficiently support the reallocation of skilled workers across sectors and tend to develop skills that are too firm-specific.

← 15. Until 2017, the report was managed by ISFOL (now INAPP).

← 16. The late registration of certain training plans in the information system can be due to two main reasons: (i) data feed into the information system only once training plans are concluded, even for multi-annual training plans, which are typically concluded years after their approval; and (ii) information on training plans is sometimes corrected after quality checks, with a view to improving the quality of the information system.

← 17. This is particularly important in the Italian context where – as discussed in Chapter 3 – there is little information available on the benefits of learning, and many workers – especially the most disadvantaged – lack motivation to participate.

← 18. Some local authorities have established regional offices to monitor and assess labour and vocational training policies. For instance, the Istituto Regionale per la Programmazione Economica della Toscana (IRPET) carries out periodic evaluations of labour market and vocational training policies in the Tuscany Region.
