Shaping Digital Education
9. Monitoring and evaluation of digital education
Abstract
This chapter discusses current challenges related to the monitoring and evaluation of digital education, including the extent of penetration of digital technologies, their impact on learner outcomes, and the effectiveness of digital education policies. It also outlines some promising ways forward in building national monitoring and evaluation frameworks for digital education, including establishing frameworks linked to digital education strategies and building on existing national evidence development activities.
Introduction
As explained at the outset of this report, digitalising education systems is not a goal in itself, but it is a valuable tool that can help to enhance quality, equity and efficiency in education. Ensuring that digitalisation policies meet these needs requires information on the progress of education digitalisation and its impact on desired education outcomes. In this context, this chapter discusses how governments can best monitor the state of digitalisation in education and evaluate the effects of their policies across all dimensions of digital education.
For the purpose of this chapter, monitoring is understood as the systematic collection of performance data that can be used to track the progress of policies and the achievement of policy objectives in order to identify relevant system challenges and weaknesses. Policy evaluation, on the other hand, is the structured assessment of the design, implementation and results of a specific policy intervention, and serves the purposes of accountability and of learning about the impact of individual policies (OECD, 2019[1]).
A well-designed monitoring and evaluation framework can act as a helpful guide for policy development and implementation on the use of digital technologies in education. Although countries may pursue different rationales or objectives in their digital education strategies (depending also on their education systems’ state of digital development), a comprehensive monitoring and evaluation infrastructure that is aligned with a country’s strategic vision for digitalisation is key to assess progress towards policy objectives and identify potential implementation challenges.
Substantial information gaps exist in national evidence infrastructures regarding the effective use of digital technologies in education, the presence of the necessary equipment, the human and institutional capacities for digital education and the extent of effective regulation of digital education. These information gaps have emerged for many reasons, including the relatively low policy priority attached to digitalisation in education systems until recently, the difficulty of arriving at a common understanding of and definitions related to digitalisation, and a lack of information on how users are integrating technology into teaching and learning processes (OECD, 2021[2]).
In the light of scarce information, this chapter examines a range of potential sources that governments could use to develop monitoring and evaluation infrastructures such as international indicators and institution-level external evaluation reports. It also highlights several ways in which governments can close information gaps and strengthen the evidence base around digital education. Some of the key questions on this issue that policy makers need to consider include:
What information on the state of digitalisation is currently available along the different policy dimensions of digitalisation in education?
How can governments take a systematic and holistic approach to monitoring and evaluation?
Which existing sources of information could governments draw on to monitor education digitalisation and how can they close current data gaps?
What is the state of evidence on the effectiveness of digitalisation and how might governments go about strengthening this evidence base?
Recent developments and current challenges
Significant information gaps persist on policy progress along most dimensions of digital education
A lack of information on spending on digital education undermines possible cost-benefit analysis
As observed in Chapter 5, little information is available at system level about the extent of investment in digital infrastructure in education. This is in part due to the inadequacy of current accounting and budgeting practices to track expenditure on digital education. These information gaps preclude attempts to link the benefits of digital education to its costs and thus undermine assessments of the efficiency gains derived from digital education. Improving data on investment in digitalisation is a necessary first step to understanding the value delivered by digital education, compared to its cost, but will likely require revisions to national and international accounting standards as well as national approaches to budgeting. Currently, international collaborative initiatives such as the OECD’s Going Digital project are progressing with the necessary technical work to improve the identification of digital activities in statistical data in all sectors of the economy (OECD, 2022[3]).
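To illustrate why better spending data matter, the sketch below shows a simple cost-effectiveness calculation of the kind that improved budget tracking would enable. It is a minimal sketch: all figures, cost categories and the assumed effect size are hypothetical placeholders, not actual estimates from any country.

```python
# Minimal sketch: a cost-effectiveness ratio for a digital education programme.
# All figures and category names are hypothetical placeholders.

annual_costs = {
    "devices": 1_200_000,      # purchase and replacement of student devices
    "connectivity": 300_000,   # broadband subscriptions and network upgrades
    "software": 250_000,       # licences for learning platforms
    "training": 150_000,       # professional development for teachers
}

students_reached = 25_000
# Assumed average learning gain attributable to the programme, expressed in
# standard deviations of test scores (as an evaluation study might estimate).
effect_size_sd = 0.08

total_cost = sum(annual_costs.values())
cost_per_student = total_cost / students_reached
# Cost per 0.01 SD of learning gain per student: one common way to compare
# interventions whose learning gains are measured on different tests.
cost_per_unit_gain = cost_per_student / (effect_size_sd * 100)

print(f"Total annual cost: {total_cost:,.0f}")
print(f"Cost per student: {cost_per_student:,.2f}")
print(f"Cost per student per 0.01 SD gain: {cost_per_unit_gain:,.2f}")
```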
While Internet connectivity is relatively well-monitored at system level…
One exception to the general lack of information on digital infrastructure relates to Internet connectivity. A policy focus on improving broadband access and connection speeds has led to the development of a range of indicators that measure progress on broadband roll-out and inequalities in connectivity (OECD, 2022[4]). National research and education networks (NRENs) also routinely monitor and provide information on the connectivity speeds from their backbone to clients in different locations. This means that information on Internet connectivity can be easily included in many national monitoring and evaluation frameworks.
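The sketch below illustrates the kind of periodic throughput probe that underpins such connectivity indicators. It is a minimal illustration only: the test URL is a placeholder, and a real deployment would use dedicated measurement endpoints and established tools rather than a single file download.

```python
# Minimal sketch of a connectivity probe a school or NREN client might run
# periodically. The URL is a hypothetical placeholder; real measurement
# programmes use dedicated endpoints and repeated tests.
import time
import urllib.request

TEST_URL = "https://example.org/test-file"  # placeholder measurement endpoint

def measure_download_mbps(url: str, timeout: int = 10) -> float:
    """Download a test file once and return the observed throughput in Mbit/s."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        payload = response.read()
    elapsed = time.perf_counter() - start
    return (len(payload) * 8 / 1_000_000) / elapsed

if __name__ == "__main__":
    print(f"Observed throughput: {measure_download_mbps(TEST_URL):.2f} Mbit/s")
```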
… less is known about the availability and quality of digital equipment within education institutions
Unlike information on Internet connectivity, less data appears to be available at system level about the adequacy and quality (and to some extent the availability) of digital technologies in schools and higher education institutions, including their technical equipment and local area network capacity.
At the school level, there are some examples of data collections carried out by governments in order to assess the availability of digital infrastructure in the school system, and identify gaps:
In England (United Kingdom), a biennial Technology in Schools survey was recently launched to gather up-to-date data on the current state, use and spread of technology within primary and secondary schools in England and to inform policy making. Findings from the first survey round will be published in the summer of 2023.
In 2021 Utah (United States) carried out the fourth iteration of its regular School Technology Inventory, which has run since 2015 and provides data on the stock and age of digital devices, hardware and software and teaching resources in every public school district and charter school across the state (UEN and Connected Nation, 2022[5]).
The Flemish Community of Belgium also administers a sample-based survey (targeting about 20% of Flemish schools) to school leaders, teachers and students every five years, which focuses, among other topics, on digital infrastructure (European Commission/EACEA/Eurydice, 2019[6]) (Heymans et al., 2018[7]).
In addition to national surveys, international surveys can provide indications on digital equipment in schools:
The European Commission’s “Survey of Schools: ICT in Education”, administered in 2011/12 and again in 2017/18, provided country-level information on access to digital infrastructure in schools based on interviews with school leaders, teachers, students and parents (European Commission, 2019[8]; European Commission, 2013[9]). In contrast to the Utah inventory, however, the survey was designed to yield country-level information, rather than to monitor the availability of technology in individual schools.
The International Computer and Information Literacy Study (ICILS) also includes information on digital infrastructure in schools, although the central focus is on students’ digital literacy (Fraillon et al., 2019[10]).
With a broader country reach, OECD surveys such as TALIS or PISA can also provide useful perspectives on digital infrastructure, since they ask school leaders about the adequacy of the digital infrastructure and the extent to which shortages or inadequacy of digital resources hinder the provision of quality instruction in their schools. TALIS also asks teachers about spending priorities for the education system, where one response category relates to digital infrastructure. While useful in identifying the presence of specific challenges related to the digital infrastructure, such data do not cover higher education, and their country coverage of the primary and upper secondary levels of education remains limited (OECD, 2022[11]; OECD, 2019[12]; OECD, 2020[13]). A stylised sketch of how such survey responses can be aggregated into a system-level indicator follows this list.
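The sketch below shows how a weighted system-level indicator can be derived from school-questionnaire microdata of this kind. It is a minimal illustration: the column names are hypothetical, not the actual PISA or TALIS variable names, and real analyses would also use replicate weights to estimate standard errors.

```python
# Minimal sketch: computing a weighted system-level indicator from
# school-questionnaire microdata. Column names are hypothetical placeholders.
import pandas as pd

# One row per responding school leader:
#   shortage_digital: 1 if quality instruction is reported to be hindered
#                     by a shortage or inadequacy of digital resources
#   school_weight:    survey weight making the sample representative
df = pd.DataFrame({
    "shortage_digital": [1, 0, 0, 1, 1, 0],
    "school_weight":    [120.5, 98.2, 143.0, 76.4, 110.1, 89.9],
})

weighted_share = (
    (df["shortage_digital"] * df["school_weight"]).sum()
    / df["school_weight"].sum()
)
print(f"Weighted share of schools reporting shortages: {weighted_share:.1%}")
```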
At higher education level, there are only a few examples of efforts to take stock of digital equipment, particularly in Europe. The long-running National Survey of e-learning and Information Technology in higher education institutions in the United States was first fielded in 1990 by the Campus Computing Project. While it does not directly collect data on infrastructure inventories, it elicits information on the categories of equipment on which technology budgets are spent, and on perceptions of the quality of existing digital equipment (The Campus Computing Project, 2019[14]). In Ireland, a 2016 national review of higher education technical infrastructure aimed to develop evidence to support the future digital development of higher education institutions, using the Campus Computing survey instrument (National Forum for the Enhancement of Teaching and Learning, 2016[15]). However, the survey was not repeated.
Along with the shortage of data on expenditure on digital technologies, a lack of information on inventories of digital equipment limits the capacity to plan for future public investment in digital infrastructure renewal. From an equity perspective, monitoring of digital equipment is also important to ensure that all students have beneficial exposure to tools that can help to build their digital skills or seize the benefits of digital education. Finally, given that ad-hoc research studies show mixed results regarding the relationship between the use of computers in an education setting and student outcomes (Bulman and Fairlie, 2016[16]), monitoring the availability, use and condition of digital equipment can help to structure and inform future research on its effects.
Research on the effective use of digital technologies in education is fragmented…
Many studies have been conducted on the impact, and to some extent on the cost-effectiveness, of digital technologies. As pointed out in Chapter 1, our understanding of the impact of digital education technologies has evolved as more rigorous research designs allowing causal inference have been developed and applied. While the number of rigorous studies has steadily increased and the COVID‑19 pandemic has brought renewed interest and opportunities to examine the outcomes of digitally enhanced learning activities, there remains substantial scope to explore the mechanisms and uses that enable a positive effect of digital education technologies on student performance and other outcomes.
As shown in Chapter 1, reviews of a range of studies of the effectiveness of digital education tools and methods show mixed results regarding their efficiency, quality and impact on equity, across all levels of education. Information gaps and mixed results are particularly acute when it comes to measuring impact and efficiency gains, which are key to mobilising actors around digital education. In higher education, where evidence on the effectiveness of digital technologies does exist, it is often contradictory, and its robustness is undermined by methodological concerns, such as a lack of rigour in study design (Cellini, 2021[17]; Bowen, 2012[18]).
Policy makers seeking to construct a monitoring and evaluation infrastructure must first commission or support research to review and broker existing evidence. Such efforts should aim to identify the technology families and, in particular, the technology uses that have the most conclusive impact, and that should thus be supported within education settings and subsequently followed within a monitoring framework. More general investment in and funding of novel research, focused on the use of digital technologies in education settings but also on innovative evaluation methods (e.g. methods that leverage the advantages offered by digital technologies for collecting and analysing data more rapidly), should accompany these efforts.
...and may not reflect the latest technological developments
In a fast-changing technology environment, evidence on the effectiveness of the usage of some smart technologies risks becoming rapidly outdated (OECD, 2021[19]). This is a particular challenge given the time lag involved in developing evidence: national and international survey instruments take years to develop, test, field and analyse. This concern is also reflected in the 2018 review of the PISA ICT background questionnaire, which recognises the need to update and adapt questionnaires at each cycle, as the rapid evolution of technology may render some questions irrelevant very quickly. For example, the PISA 2018 background questionnaire contained items on students’ use of portable music players and memory sticks, technologies that had already been largely replaced by streaming music on smartphones and cloud storage of electronic files (Lorenceau, Maric and Mostafa, 2019[20]). To address the rapid obsolescence of questions, survey developers tend to phrase questions in broad and generic terms, which makes it more difficult to follow the take-up of technologies at a very detailed and granular level.
The strong inertia of survey tools once evidence is collected is also an obstacle to capturing the latest technological developments. Indeed, once questions related to digital education are included in the questionnaire of a large-scale recurrent survey, policy makers of participating countries are typically interested in keeping them to allow for the monitoring of progress and trends over time. Accordingly, survey content often reflects a greater concern for trends than for relevance and coverage of emerging issues.
Likewise, many applications at the frontier of education technology are not yet established enough to permit definitive conclusions about their effectiveness in the teaching process or the viability of their use at scale. Survey instruments can only shed light on technologies after they have been adopted, rather than as they are emerging. If governments aim to involve education stakeholders more deeply in the improvement of education technology, then information is needed on emerging technologies as they are being established, not after they have been adopted (OECD, 2021[21]).
There is, therefore, a need for different monitoring tools to co-exist in order to obtain an accurate picture of digital education take-up. Large-scale surveys are useful for monitoring long-term trends, with the caveat that they may not always reflect the latest technological developments and that their content must remain general if it is not to become quickly obsolete. To address this issue, smaller-scale surveys focused on emerging technologies, their take-up and impact can provide useful complements. However, they are by nature likely to be more volatile in terms of content areas and, as a result, less useful for monitoring trends over time.
Evidence on the capacity of institutions and education staff for digital education is more developed, at least at the school level
The many potential benefits of digital education (see Chapter 1) have provided impetus for a significant expansion of digital education infrastructure in schools and HEIs. However, as infrastructure barriers have been reduced and technological possibilities for digitally enhanced teaching have expanded, the digital competence of teachers, institutional leaders and administrators, and their capacity to put these technological tools to use, have emerged as central challenges (OECD, 2019[22]; OECD, 2021[23]). Schools’ and teachers’ capacity for digital education is thus a key enabling factor for the wide adoption and spread of digital technologies in education and for realising their full potential, as described in detail in Chapter 7. Accordingly, this capacity has been a strong area of focus for studies and surveys dealing with digital education.
Against this background, the first and second “European Surveys of Schools: ICT in Education” (ESSIE) have provided a formidable vehicle and a wealth of indicators to monitor developments and trends in relation to the spread of digital technologies in European school systems and teachers’ digital education activities and engagement in professional development related to digital technologies (European Commission, 2019[8]; European Commission, 2013[9]).
Another key source of evidence in this area is the OECD Teaching and Learning International Survey (TALIS), which surveys teachers and school leaders from primary to upper secondary schools and, in an adapted format, early childhood education institutions. Among other topics, the use of ICT in teaching has been a consistent and growing area of focus for TALIS since its first round in 2008 (OECD, 2009[24]; OECD, 2015[25]; OECD, 2020[26]; OECD, 2021[27]; OECD, 2022[28]). TALIS’ deliberate focus on teachers and their development makes it a natural vehicle for monitoring the existence of an enabling institutional and human infrastructure. The direct surveying of teachers and school leaders – rather than students – also makes it a reliable source for assessing training needs and capacity issues.
A number of other international surveys or assessments, including TIMSS, PIRLS, PISA and ICILS provide further data with some elements of digital education. Yet most of these existing international surveys or assessments – including TALIS – rely on teachers’ (or school leaders’) self-reports and perceived efficacy as a proxy for the actual digital education skills of teachers, and their schools’ capacity for digital education. Evidence on educators’ actual skills in integrating digital technologies in teaching is needed for a more accurate overview of capacity constraints.
Information on the status of data protection and cyber security related measures in schools is scarce
Chapter 4 discussed the growing cyber threats faced by education systems, as well as the greater responsibility that digital education brings regarding the protection of student data. While there have been some efforts to raise schools’ awareness of these topics and inform the relevant players about necessary actions, little has been done to monitor the implementation of measures regarding cyber security and data protection.
With respect to cyber security, there are some examples of third-party reports on the state of risk exposure of schools. For instance, a report on “Cyber Security in schools” in the United Kingdom was recently released by a collaboration between university, charity and private sector players (University of Kent, SWGfL and Bitdefender, 2022[29]). With respect to data protection, schools in OECD countries that are part of the European Union face strict reporting requirements under the GDPR: schools have to appoint a Data Protection Officer who is responsible for reporting data breaches to the relevant national data protection authority. Yet, there is no system-wide information available on the implementation of data protection measures in schools.
Even where monitoring and evaluation is conducted, it is often focused on specific initiatives and programmes, rather than embedded in a systematic approach
Careful monitoring and evaluation are crucial for supporting innovation in the use of digital education technologies (Redecker et al., 2017[30]). In 2017, a Joint Research Centre review of the design of digital education policies in Europe acknowledged that the integration and innovative use of digital technologies in education had become a policy priority across Europe but found that most reforms either did not have an associated monitoring and evaluation process, or monitoring was tied only to the implementation of the specific programme. The recent UNESCO guidelines for ICT in education policies and masterplans also emphasise the importance of monitoring and evaluation and their role in enabling an iterative approach to policy making where the success of previous measures informs future decisions (UNESCO, 2022[31]).
Outside of evaluating specific policies during or immediately after their period of implementation, more systematic and persistent attempts to monitor and evaluate digitalisation are still rare, although more recent evidence shows promising developments in this area, at least for school education, as discussed later. Table 9.1 shows that while half of the countries covered by a 2018/2019 review of digital education strategies in Europe carried out some form of monitoring and evaluation, few countries indicated that they conducted these activities regularly or had set a clear time frame (European Commission/EACEA/Eurydice, 2019[6]). However, information from an ad-hoc data collection performed by the OECD in September 2022 provides initial signs that some countries are now adopting more regular monitoring and evaluation as part of their strategies for digital education. Annex Table 9.B.1 provides a stocktake of national monitoring and evaluation policies.
Overall, despite promising initiatives in some countries, most available data and evidence tends to be based on one-off research studies, or data collections that last only for the lifetime of a particular strategy and policy. There are few examples of recurrent data collections that permit countries to monitor trends or follow outcomes over time, or to use the data collected to model relationships between digital technologies and learning outcomes. The ad-hoc nature of monitoring and evaluation of digital education, often relying on different research designs, also tends to create conflicting evidence on its impact, and limits insight into which technology families and, most importantly, technology uses create the best impact for learners and should be facilitated by school practitioners and policy makers.
Further, monitoring appears more advanced in school education than at other levels. In higher education, most countries still face difficulties in systematically measuring how much digitalisation is taking place, the ways in which digitalisation is unfolding and changing the practices of their staff and students, and the impact of digitalisation on higher education performance. The widely observed lack of system-level data on the digitalisation of higher education stems from several factors. These include, in particular:
the low priority – until recently – placed by many governments on monitoring digitalisation in higher education
the difficulty of defining digital higher education given the wide diversity of practices referred to by commonly used terms, such as “e-learning” or “digitally enhanced teaching and learning”, and the increasingly blurred line between different degrees of digitalisation as the use of at least some digital technologies for some higher education activities is now widespread
the need for adequate, and potentially costly, data collection tools to help understand the practices and attitudes to technology of higher education students and staff
Table 9.1. Existence of monitoring and evaluation provisions for digital education across EU countries, 2018-19 and 2022
| Countries/economies | Monitoring and/or evaluation of digital education strategies and policies carried out in the last five years by top-level authorities, 2018-19 (source: Eurydice) | Existence of monitoring and evaluation provisions for top-level strategies of digital education, 2022 (source: OECD) |
|---|---|---|
| Austria^C | Yes (ad-hoc) | Yes |
| Belgium (Flemish Comm.) | Yes (regular) | Yes |
| Belgium (French Comm.)^C | No | No |
| Belgium (German-speaking Comm.)^C | No | No |
| Bulgaria^C | Yes (regular) | Yes |
| Czech Republic^C | Yes (regular) | Missing |
| Cyprus | No | Missing |
| Croatia^C | Yes (ad-hoc) | Yes |
| Denmark^C | Yes (ad-hoc) | No |
| Estonia^C | Yes (regular) | Yes |
| Finland^C | Yes (ad-hoc) | Missing |
| France | Yes (ad-hoc) | Yes |
| Germany^C | Yes (ad-hoc) | Yes |
| Greece | No | Missing |
| Hungary^C | No | Missing |
| Ireland | Yes (ad-hoc) | Yes |
| Italy^C | Yes (ad-hoc) | Yes |
| Latvia^C | No | Yes |
| Lithuania^C | No | Yes |
| Luxembourg | No | Missing |
| Malta | No | Missing |
| Netherlands | Yes (ad-hoc) | No |
| Poland | Yes (ad-hoc) | Missing |
| Portugal^C | No | Yes |
| Romania^C | Yes (ad-hoc) | No |
| Slovak Republic^C | No | Yes |
| Slovenia | Yes (ad-hoc) | Missing |
| Spain^C | No | Yes |
| Sweden^C | Yes (regular) | Yes |
Note: Information from 2019 was taken from the Eurydice report on ‘Digital Education at School in Europe’. To update this information, the OECD reached out to the national officials in the Eurydice country units of all EU member states in 2022 and conducted background research on their digital education strategies. Superscript ‘C’ in the country column indicates that the information displayed was obtained from national officials. Further information to contextualise the monitoring and evaluation provisions in 2022 is provided in Annex 9.B.
Source: Eurydice (2019[6]), Digital Education at School in Europe, Eurydice Report. Luxembourg: Publications Office of the European Union, https://eacea.ec.europa.eu/national-policies/eurydice/content/digital-education-school-europe_en (Accessed on 10 September 2022) and OECD data gathering.
Promising approaches for monitoring and evaluating digital education
There are multiple possibilities for creating a monitoring and evaluation infrastructure for digital education
A key conclusion of Chapter 2 is that the policy ecosystem for high-performing digital education should be centred on a strategic vision; should include mechanisms for effective coordination across policies; and should include feedback loops to permit revision of the strategy. At a national level, this vision is best achieved through a process of systematic monitoring of progress and possible implementation challenges, and wide consultation to agree on the elements of the monitoring framework (OECD, 2013[32]). It also requires improvements in the supply of high-quality data and evidence sources to make the case for reform at the vision-setting stage, and greater efforts to institutionalise monitoring and evaluation practices later on to track progress against the objectives outlined in the digital education strategy.
In the interest of efficiency, national governments can initially assess and draw upon existing national and international frameworks and data sources to monitor and evaluate the implementation of their digital education policies (UNESCO, 2022[31]). These may include administrative data, surveys of student or teacher experiences and perceptions, promotion of institutional self-evaluation frameworks to support self-reflection and improvement of institution-level digital strategies, a digital focus of quality assurance evaluation processes, research projects and findings, and the adoption of frameworks to measure the digital competence of educators.
However, as discussed above, there are substantial gaps in national and international data ecosystems that limit the extent to which investment, use and impact of digitalisation can be measured, monitored or evaluated. In turn, this limits countries’ ability to develop a coherent monitoring and evaluation infrastructure for digitalisation across education systems. A smart mobilisation of existing evidence can already help countries assess the state of the digital maturity of their education institutions against objectives or benchmark their own performance against international education systems. However, many of the gaps will only be filled through new data development or the mobilisation of new sources of data/evidence (e.g. big data).
Thus, creating a comprehensive monitoring and evaluation framework may comprise the adaptation of existing data collection frameworks, the design and development of original data collections and the mobilisation of novel sources of data/evidence. Collection of novel empirical data to inform all elements of digitalisation in education is a demanding prospect, requiring multi-year development processes and substantial financial and human resources, which creates a burden on data providers (e.g. survey respondents or administrators preparing data submissions). A realistic monitoring and evaluation framework will need to account for resource constraints and the reporting burden placed on institutions, and thus use or adapt existing data resources as much as possible, carefully balancing the benefits of new data collections with their associated administrative and financial costs.
Develop a national framework for monitoring and evaluating digital education…
The measurement of the range of activities that comprise the ‘digital economy’ is an emerging area of policy concern across all economic and social sectors. Many measurement efforts have focused on assessing the extent of adoption of digital technologies in private business and industry. However, there is increasing impetus for governments to monitor the social impact of digital technologies and the extent to which digitalisation is supporting social goals and transforming government services (OECD, 2020[33]). Measurements related to education and skills are often considered foundational enabling factors for all digital policy dimensions. At the same time, education-related indicators integrated into wider digitalisation monitoring frameworks tend to focus on the supply of human capital for labour markets and the wider economy. To date, little emphasis has been placed on systematically measuring and monitoring digitalisation within the education sector.
However, policy makers are increasingly aware of the need to measure digitalisation within their education systems to ensure accountability and enable evidence-driven policy making. In schools, a growing number of education systems work on developing a stronger evidence base on the penetration and impact of digital education, and evaluate the effectiveness of different digital pedagogical approaches, learning resources or tools. These new initiatives are often linked to the creation of digital education strategies. For example:
The Schools Digital Strategy of New South Wales (Australia) provides an example of a comprehensive and co‑ordinated digital education strategy that puts forward a vision for digital education, and proposed actions co-designed between the government, school leaders, teachers and parents. The digital strategy also acknowledges the need to track outcomes, by measuring how schools are improving their digital maturity, and to gauge the most effective approaches to digital education. To do so, the strategy envisions facilitating access to education data by policy makers to analyse which digital pedagogies, teaching resources, learning approaches, tools and techniques deliver the best learning outcomes (Department of Education, Australia NSW, n.d.[34]).
In Italy, the 2022 School Digitalisation Plan “Piano Scuola 4.0” foresees the implementation of two key actions: ‘next generation classrooms’ and ‘next generation labs’. While the former concerns the creation of digital learning environments in classrooms, the latter focuses on strengthening students’ skills in areas such as robotics, AI and coding. Implementing schools will undergo monitoring activities every six months, including the collection of qualitative and quantitative data on the progress of implementation and on the outputs and outcomes of the projects. These data points will be compared against schools’ performance on the national evaluation system and will be published on an online dashboard (Ministry of Education Italy, 2022[35]).
The pilot national e-Schools digitalisation project in Croatia developed a concept of “levels” of digital maturity. The levels are intended to indicate the initial extent of maturity in schools, monitor their progress as investments are made in digital technologies (including network connectivity, laptops and educational software), and provide a generic assessment of the outcomes of the project in terms of schools’ increase in digital maturity (Balaban, Begicevic Redjep and Klacmer Calopa, 2018[36]); a stylised sketch of such a maturity rubric follows these examples. The pilot project covered 151 schools, and evaluations indicated that most schools raised their digital maturity by at least one level as a result of the pilot, and that pilot schools were able to pivot quickly to remote instruction during the COVID‑19 pandemic. A second phase of the project, covering all schools in Croatia, is currently underway and includes an expanded education programme for the development of staff digital competences (Centre for Applied Psychology at the Faculty of Philosophy in Rijeka, 2018[37]).
In April 2022, Ireland launched a revised Digital Strategy for Schools. The strategy is accompanied by Implementation Plans. The first plan will run from 2022-2024 and is intended to develop appropriate oversight and measurement processes and procedures to provide for effective implementation of the strategy. These sources of evidence will inform a midterm review at the end of the first phase, and the next Implementation Plan from 2025-2027 (Ireland, 2022[38]).
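The sketch below illustrates, in stylised form, how a digital maturity rubric of the kind used in the Croatian e-Schools project can map school self-assessment scores to maturity levels. The domain names, level labels and thresholds here are hypothetical placeholders, not the actual instrument.

```python
# Minimal sketch of a rubric mapping school self-assessment scores (1-5 per
# domain) to a digital maturity level. Domain names, level labels and the
# averaging rule are hypothetical illustrations, not the actual framework.
from statistics import mean

LEVELS = ["1 - Basic", "2 - Initial", "3 - e-Enabled", "4 - e-Confident", "5 - e-Mature"]

def maturity_level(domain_scores: dict[str, float]) -> str:
    """Average the 1-5 domain scores and map the result to a maturity level."""
    avg = mean(domain_scores.values())
    index = min(int(avg) - 1, len(LEVELS) - 1)  # e.g. an average of 3.4 -> level 3
    return LEVELS[max(index, 0)]

school = {
    "planning_and_leadership": 3.0,
    "ict_in_teaching_and_learning": 3.5,
    "staff_development": 2.5,
    "ict_culture": 3.0,
    "infrastructure": 4.0,
}
print(maturity_level(school))  # -> "3 - e-Enabled"
```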
There are also some examples of nascent national efforts to monitor digitalisation in higher education systems. For example, recent OECD and EC collaboration with Hungary focused on efforts to define indicators on digitalisation in higher education (OECD, 2021[2]). Similarly, a recent OECD-EC project in Croatia examined the digital maturity of higher education institutions using quantitative and qualitative means (OECD, 2023[39]).
The creation of a national monitoring and evaluation infrastructure for digitalisation in education will require careful consideration and long-term investment in its incremental development. Prior to developing a monitoring framework, national discussions and consultations will be needed to define the specific elements of digitalisation that should be monitored or evaluated, as well as other operational elements like the periodicity of monitoring processes and the assignment of resources to a monitoring and evaluation function. Based on recent OECD recommendations for building capacity for evidence development and policy monitoring (OECD, 2021[40]; OECD, 2020[41]), important steps in the process may include:
Mapping existing national and international data and evidence, and raising stakeholders’ awareness of currently available sources of information
Undertaking, together with stakeholders, a systematic identification of current evidence gaps and likely future information needs, taking into account policy objectives
Establishing, ideally through consensus, an agreed list of indicators to be tracked within a national monitoring and evaluation framework, taking into account existing data availability, the importance of the signal provided by each indicator, and the need for parsimony in a context of finite resources (a possible structure for such an indicator registry is sketched after this list)
Evaluating organisational capacities to design, develop, contribute to, and disseminate new data and evidence gathering initiatives
Agreeing on roles and responsibilities within the system for the monitoring and evaluation framework, including evidence gathering, processing and dissemination.
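As a concrete illustration of the indicator list mentioned above, the sketch below shows one possible data structure for an indicator registry. The field names and the example entry are hypothetical illustrations of the metadata such a registry might carry, not a prescribed standard.

```python
# Minimal sketch of an indicator registry for a national monitoring and
# evaluation framework. All field names and the example entry are hypothetical.
from dataclasses import dataclass

@dataclass
class Indicator:
    code: str               # short unique identifier
    name: str               # human-readable label
    policy_dimension: str   # e.g. infrastructure, capacity, regulation
    source: str             # data collection the indicator is derived from
    periodicity: str        # how often new data become available
    unit: str               # e.g. percentage, count, index score
    disaggregations: tuple  # breakdowns the source supports

registry = [
    Indicator(
        code="INF-01",
        name="Share of schools with high-speed broadband",
        policy_dimension="infrastructure",
        source="NREN administrative records",
        periodicity="annual",
        unit="percentage of schools",
        disaggregations=("region", "school level", "urban/rural"),
    ),
]

for ind in registry:
    print(f"{ind.code}: {ind.name} ({ind.periodicity}, from {ind.source})")
```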
Consider new data development initiatives to systematically monitor the state of digitalisation in education systems
A national monitoring and evaluation framework should also ideally promote new data development related to the state of digitalisation across all sectors of education, including schools, higher education institutions, vocational and adult learning providers.
Austria provides a recent example of a national effort to develop a holistic education monitoring system (EMS), as a basis to assess the impact of policy actions and subsequently adjust policies and implementation. The development followed a stepwise process: first defining the goals of the framework and then, with stakeholder engagement, developing an “indicator monitoring plan”. The final step entailed the development of a technical solution to bring together data from disparate sources into the monitoring framework. An OECD analysis found that the EMS design could be further improved by: articulating how the information in the framework should feed into the improvement of learning outcomes; building a stronger data culture; focusing on securing resources at the planning stage; and ensuring that the efforts to develop the framework are compatible with and complementary to other ongoing and planned policy initiatives (OECD, 2021[40]).
Japan has also recognised the importance of integrating data more profoundly into education systems, as part of its objective to create “a society where anybody, at any time and place, can learn with anybody in his/her own way”. Following widespread consultation, public authorities have created a roadmap for digitalisation which envisages providing a “big picture” of data in education, through bringing together, enhancing and standardising existing data sources (for example, by adopting international standards into national data frameworks). The first stage of the roadmap entails moving education institutions’ administrative processes, procedures and data collections online as much as possible. A second stage envisages using the online platforms built in stage one as a basis to collect and analyse log data from learner devices that can feed into multi-dimensional monitoring and evaluation processes. A third stage could begin to use the data collected to support individually optimised learning and to evaluate progress on academic achievement and non-cognitive skills. The roadmap is intended to cover all aspects of the Japanese education system (Digital Agency et al., 2022[42]).
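To make the second stage more concrete, the sketch below shows how learner-device log events might be aggregated into simple engagement indicators. The log format, field names and event types are hypothetical placeholders, not the actual Japanese specification.

```python
# Minimal sketch of turning learner-device log events into simple engagement
# indicators. The log layout and field names are hypothetical placeholders.
import pandas as pd

events = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s2", "s2", "s3"],
    "event_type": ["login", "exercise_completed", "login",
                   "exercise_completed", "exercise_completed", "login"],
    "timestamp": pd.to_datetime([
        "2024-05-01 08:05", "2024-05-01 08:40", "2024-05-01 09:00",
        "2024-05-01 09:25", "2024-05-02 10:10", "2024-05-02 11:00",
    ]),
})

indicators = (
    events.assign(date=events["timestamp"].dt.date)
    .groupby("student_id")
    .agg(
        active_days=("date", "nunique"),
        exercises_completed=("event_type", lambda s: (s == "exercise_completed").sum()),
    )
)
print(indicators)
```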
Estonia is another example of a country that has adopted a comprehensive approach to the monitoring and evaluation of its digital education progress. Digital education (with a focus on the digital competences of learners and teachers, digital solutions and learning environments) is addressed as part of a broader lifelong learning strategy implemented through three-year programmes and monitored annually based on a set of indicators. Further, schools are advised to structure their internal evaluations around activity indicators, including the frequency of digital technology use in learning and teaching. In addition, Estonia has piloted a low-stakes test of students’ digital competences as part of quality assurance procedures. Previous evidence showed that the country also relied on schools’ self-reporting on their digital technology infrastructure, surveys of students, teachers and parents in Estonian schools, as well as an annual report developed by a specialised agency (European Commission/EACEA/Eurydice, 2019[6]). More generally, the strength of the Estonian monitoring and evaluation system lies in the Estonian Education Information System (EHIS), a digital, online and encompassing information system that brings together data on schools, pupils, teachers, exams and qualifications (OECD, 2020[43]). The accuracy of the data (due to live data collection), its structure (enabling very fine analyses), the possibility of connecting it to other national databases and its accessibility to the wider public (through the online platform Educational Eye) are among the main strengths of EHIS. Overall, the case of Estonia shows how well-developed information systems contribute to successful digital education governance and policy making (OECD, 2020[43]).
Portugal can also provide a source of inspiration as a country that is leapfrogging its digital transformation (Estevez et al., 2021[44]). Digital education is embedded within a broader digital transformation strategy which is linked to a comprehensive action plan (Portugal, 2020[45]). In order to ensure proper monitoring of the action plan’s programmes and initiatives, a monitoring framework was developed based on a list of about 100 indicators. In education, an Observatory for Digital Competences has developed a comprehensive indicator framework measuring trends over time for the selected indicators (Direção-Geral de Estatísticas da Educação e Ciência, 2020[46]). Further, Portugal developed an online platform at the end of 2021 to report the progress of school digitalisation and to allow continuous data collection. The data are entered into the system by digital ambassadors who work directly with schools in supporting and monitoring digitalisation. The platform was launched at the beginning of 2022 and has already been used to collect data on the status of teacher training and the digital development of schools, which is shared publicly on an online dashboard (República Portuguesa, n.d.[47]).
Connect monitoring initiatives to the national vision for digital education and take into account broader social goals connected to digitalisation
Ideally, monitoring and evaluation of digitalisation in education should be based on a national strategic vision of the role that digitalisation should play in education systems, and its intended impact. Indeed, a shared vision on goals can provide a strong foundation for the identification of relevant performance targets and potential indicators for monitoring and performance evaluation, as the examples from Portugal or Ireland illustrate.
The analytical framework presented in Chapter 1 can also serve as a foundation for monitoring and evaluation. It provides a broad, comprehensive and systematic overview of dimensions along which progress in digitalisation can be measured. These dimensions are:
the effective use of digital technologies through adequate pedagogies, curricula and assessment frameworks
the presence of the necessary guidance and a regulatory framework for digital education
the adequacy of funding and procurement mechanisms for digital education
the availability of accessible, innovative and high-quality infrastructure for digital education
the capacity of educators, institutions and at a system level to engage in digital education
the extent to which human resource policies incentivise and empower educators’ effective use of digital technologies.
The impact of policy reforms along these dimensions should then be assessed in terms of access and equity, quality and efficiency.
In progressing with the development of a monitoring and evaluation framework for digital education, governments also need to bear in mind the broader context for the monitoring of digitalisation more generally, and account for the priorities for measurement of the wider digital economy outlined in the OECD’s Going Digital Roadmap (OECD, 2022[3]), namely:
Making the digital economy visible in national accounts and statistics
Understanding the impact of digital transformation
Encouraging measurement of the impact on social goals and well-being
Designing new and interdisciplinary approaches to data collection
Monitoring emerging technologies
Improving the measurement of data and data flows
Defining and measuring skills
Measuring trust in online environments.
In operationalising the monitoring and evaluation framework, attention should also be paid to minimising response burden. A national monitoring framework should thus as much as possible take existing data and indicators as its starting point, where they exist, and expand, where possible, through modification of existing data collections. The next section discusses the possibilities of leveraging and building on existing national and international evidence development activities for the purposes of monitoring and evaluating digital education.
Leverage and build on existing sources of evidence for the development of national monitoring and evaluation infrastructures
A national monitoring and evaluation framework can draw upon several evidence streams, as described below. Strategies for building an evidence infrastructure for digital education can include adding a “digitalisation lens” to current national administrative and statistical data collections, expanding and repeating previous one-off surveys, incorporating internationally comparative indicators, and making greater use of qualitative sources of evidence, such as quality evaluation reports and the results of research studies.
Add a “digitalisation lens” to national administrative and statistical data collections where possible
Administrative data systems, such as student information systems, are widely used by education institutions, and most governments impose common reporting requirements on public and government-dependent private institutions to monitor their activities. These data points feed into the production of official statistics and are passed on to international organisations, for example through the joint UNESCO/OECD/Eurostat (UOE) annual data collection. As governments advance on strategic objectives related to digitalisation in education, evidence on some forms of digitalisation may be collected through the adaptation of these existing data collections.
The Integrated Post-Secondary Education Data System (IPEDS) in the United States offers an example of how an administrative data collection can be adapted to support indicator frameworks on digitalisation. The IPEDS is a national database on post-secondary institutions in the United States, maintained by the National Center for Education Statistics, part of the United States’ Department of Education Institute of Education Sciences. All public and private institutions (including higher education institutions and many vocational education providers) that receive federal funding are required by law to report their administrative data to IPEDS in aggregate form. Because of its coverage, IPEDS data may be used to generate both institution and system-level indicators related to digitalisation, and to model the relationship between the extent of digitalisation and other institution-level indicators. For example, IPEDS collects data from each institution on their “distance education” activities, defined as “education that uses one or more types of technology to deliver instruction to students who are separated from the instructor and to support regular and substantive interaction between the students and the instructor synchronously or asynchronously” (NCES, 2021[48]).
Table 9.2 shows the items collected as part of the IPEDS annual survey that can be used to routinely monitor students’ enrolments and graduations by mode of delivery, and the extent to which institutions are offering programmes through distance education. In addition to distance education data, IPEDS collects data on institutions’ digital/electronic library resources, including the number of digital/online books, databases, media and serials. These latter variables could also be used to develop useful measures of the digital infrastructure of institutions. A stylised calculation over institution-level data of this kind is sketched after Table 9.2.
Table 9.2. Overview of distance education data items collected annually in IPEDS
| Indicator | Data coverage period | Description |
|---|---|---|
| Institutional Characteristics (IC) | Current academic year | Captures whether institutions offer distance education courses and/or programmes for undergraduate and graduate students, and whether all programmes are offered exclusively via distance education |
| 12-Month Enrolment | July 1 - June 30 (prior year) | Captures the number of students enrolled in distance education courses over a 12-month period |
| Fall Enrolment | Institutions’ official fall reporting period | Captures the number of students enrolled in distance education courses in the fall term and, of the students enrolled exclusively via distance education, the number in various geographic categories |
| Completions | July 1 - June 30 (prior year) | Captures whether all, some, or none of the programmes (organised by field of study) and award levels can be completed entirely via distance education, and whether certain distance education programmes have on-site components |
Source: National Center for Education Statistics (n.d.[49]), Distance Education in IPEDS. US Department of Education, Available from: https://nces.ed.gov/ipeds/use-the-data/distance-education-in-ipeds
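The sketch below illustrates the kind of system-level indicator that can be derived from institution-level administrative data of this type. It is a minimal sketch: the table layout and column names are hypothetical placeholders, not the actual IPEDS schema.

```python
# Minimal sketch: deriving system-level distance education indicators from
# institution-level administrative data, in the spirit of IPEDS reporting.
# The data layout and column names are hypothetical, not the IPEDS schema.
import pandas as pd

institutions = pd.DataFrame({
    "institution_id": [1001, 1002, 1003, 1004],
    "total_enrolment": [12_000, 3_500, 25_000, 8_200],
    "enrolled_exclusively_distance": [1_800, 3_500, 2_500, 410],
    "offers_distance_programmes": [True, True, True, False],
})

# Share of all students enrolled exclusively at a distance (student-weighted).
share_exclusive = (
    institutions["enrolled_exclusively_distance"].sum()
    / institutions["total_enrolment"].sum()
)
# Share of institutions offering any distance programmes (unweighted).
share_offering = institutions["offers_distance_programmes"].mean()

print(f"Students enrolled exclusively at a distance: {share_exclusive:.1%}")
print(f"Institutions offering distance programmes: {share_offering:.0%}")
```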
As well as the United States, some other governments across OECD countries have also started to collect data on participation in higher education by mode of delivery:
In Australia, higher education institutions are required to report data on students according to their mode of attendance, classifying them as internal (i.e. campus-based), external (i.e. fully at a distance) or multi-modal (i.e. hybrid education).
In the United Kingdom, the Higher Education Statistics Agency collects data on the mode of study of students enrolled on degree programmes and their domicile (United Kingdom or abroad) (HESA, 2020[50]).
A major limitation of IPEDS and some other administrative reporting frameworks is the fact that data is reported only at the institution level. Student-level reporting can substantially increase capacity for monitoring student outcomes according to the mode of delivery of the programmes they are following (Miller and Shedd, 2019[51]). Student-level reporting also opens the possibility of constructing panel data to track and compare the outcomes of students exposed to different patterns of usage of technology. The Integrated Data Infrastructure in New Zealand offers an example of the substantial analytical potential of anonymised individual student records, when linked to other databases such as labour market or social protection records (Jones et al., 2022[52]). The inclusion of a variable for extramural (off-campus) and intramural (on-campus) study allows New Zealand authorities to conduct in-depth analysis of the characteristics and outcomes of students according to their mode of study (Ministry of Education, 2014[53]).
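The sketch below illustrates, in stylised form, the kind of analysis that anonymised, linked student-level records make possible. The tables, identifiers and fields are hypothetical placeholders; real infrastructures such as New Zealand’s IDI apply strict confidentiality rules and far richer linkage.

```python
# Minimal sketch of linking anonymised student records to labour market
# records and comparing outcomes by mode of study. Data are hypothetical.
import pandas as pd

students = pd.DataFrame({
    "anon_id": ["a1", "a2", "a3", "a4"],
    "mode_of_study": ["intramural", "extramural", "intramural", "extramural"],
    "completed": [True, True, False, True],
})
earnings = pd.DataFrame({
    "anon_id": ["a1", "a2", "a3", "a4"],
    "earnings_3yrs_after": [52_000, 48_500, 31_000, 50_200],
})

# Join on the anonymised identifier, then summarise by mode of study.
linked = students.merge(earnings, on="anon_id", how="inner")
summary = linked.groupby("mode_of_study").agg(
    completion_rate=("completed", "mean"),
    median_earnings=("earnings_3yrs_after", "median"),
)
print(summary)
```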
At school level, distance enrolments are less common than in higher education, hence there is less of a need to monitor school participation by mode of delivery. Notwithstanding this, several countries’ education monitoring information systems collect data from individual institutions – or for each student, see below – that can be harnessed to generate information on some aspects of digital education.
As described previously, Estonia has successfully organised its Estonian Education Information System (EHIS) around the individual student, by bringing together different databases on important parts of the education system such as schools, pupils, teachers, exams and qualifications. Its main challenge though is to foster the use of its rich data by schools to promote evidence-based decision-making (OECD, 2020[43]).
Individual level data was also used – although not as part of a country-wide education monitoring information system – to assess the impact of the “e-schools” project in Croatia. The Centre for Applied Psychology at the Faculty of Philosophy in Rijeka conducted a study during the pilot phase of the project focusing on individual level results such as learning outcomes, digital competences, and attitudes towards digital technologies of students and teachers. The study included comparisons of treated and non-treated observations as well as of observations of the same individual before and after the intervention. Data was collected through online questionnaires and digital competence tests (Centre for Applied Psychology at the Faculty of Philosophy in Rijeka, 2018[37]).
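The comparison design described above (treated versus non-treated observations, before and after the intervention) corresponds to a difference-in-differences setup. The sketch below shows a minimal version of such an estimate; the data are simulated for illustration and statsmodels is used here only as one possible tool.

```python
# Minimal sketch of a difference-in-differences estimate comparing treated
# (pilot) and non-treated schools before and after an intervention.
# Data are simulated for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "score":   [60, 62, 61, 63, 65, 72, 62, 64],   # e.g. a digital competence test
    "treated": [1, 1, 0, 0, 1, 1, 0, 0],           # 1 = pilot school
    "post":    [0, 0, 0, 0, 1, 1, 1, 1],           # 1 = after the intervention
})

# The coefficient on the interaction term treated:post is the
# difference-in-differences estimate of the intervention effect.
model = smf.ols("score ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```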
Consider expanding and repeating existing national data collections on digitalisation in education
As indicated earlier in this chapter, there are few examples of efforts to monitor digitalisation across education systems. At the same time, there have been national studies in some countries that could be updated and repeated (wherever they are not already administered on a regular basis) to strengthen the national monitoring and evaluation infrastructure.
One of the most comprehensive systemic studies of digitalisation in education was carried out in Germany by the Bertelsmann Stiftung in 2016/2017. The study is broad in scope, drawing on representative samples from four different education sectors (adult education, vocational education, schools and higher education) across Germany. The study is also distinguished by its focus on the users and usage of technology, rather than on infrastructure. Microdata from the survey were also made available through the German social science data archive, allowing researchers to conduct secondary analysis. However, the study has not been repeated since its first edition (Bertelsmann Stiftung, n.d.[54]).
Further, national data collection initiatives touching upon digital education issues can be harnessed to collect information on digital technologies used by schools. This is for instance the case for national surveys implemented in Denmark, Estonia, the Flemish Community of Belgium or Italy, as described above (European Commission/EACEA/Eurydice, 2019[6]). In New Zealand, the Council for Educational Research also conducts a survey of secondary schools every three years, which includes a brief section on teaching and learning with digital technology (Bonne and MacDonald, 2019[55]).
Repeating national data collections on digital education can also help responding institutions keep statistical and reporting needs in mind when organising internal information and data flows. The Canadian Digital Learning Research Association commenced an annual National Survey of Online and Digital Learning in Canada’s publicly funded post-secondary institutions in 2017. The regular nature of the survey has encouraged higher education institutions to improve their internal tracking of activities to ensure they can more easily report the required information.
In Ireland, the National Forum for the Enhancement of Teaching and Learning, supported by government funding, developed a comprehensive national survey of digital experiences in higher education, with strong involvement of higher education stakeholders in its design and implementation. The Irish National Digital Experience Survey (INDEx) drew responses from more than 30 000 students, teachers, librarians and others across the system, and led to the creation of indicators that assess digital readiness, digital practices and digital performance. Notably, INDEx was adapted from an existing higher education survey used to varying extents in the United Kingdom, New Zealand and Australia – the Digital Experience Insights survey, developed and provided by the UK NREN Jisc.
Every five years, the Flemish Community of Belgium publishes a study on ICT integration in Flemish education (“MICTIVO”), based on the results of a web survey conducted in 20% of Flemish schools, which gathers the views of school leaders, teachers and students. MICTIVO focuses on four components – infrastructure and policy, perceptions, competences and usage at the micro level – measured through scales derived from exploratory and confirmatory factor analysis (Heymans et al., 2018[7]).
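The sketch below illustrates, in simplified form, how a survey scale score can be derived by factor analysis, similar in spirit to the MICTIVO scale construction. The items and data are simulated, and scikit-learn’s generic factor analysis is used here purely for illustration; a real study would validate the factor structure first.

```python
# Minimal sketch of deriving a survey scale score by factor analysis.
# Items and data are simulated; this is not the actual MICTIVO procedure.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))  # one latent "ICT use" factor, 200 respondents
# Three survey items that load on the latent factor, plus measurement noise.
items = latent @ np.array([[0.8, 0.7, 0.6]]) + rng.normal(scale=0.5, size=(200, 3))

X = StandardScaler().fit_transform(items)   # standardise the item responses
fa = FactorAnalysis(n_components=1, random_state=0)
scores = fa.fit_transform(X)                # one scale score per respondent

print("Estimated loadings:", fa.components_.round(2))
print("First five scale scores:", scores[:5].ravel().round(2))
```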
Within individual education systems, many ad-hoc surveys of student and teacher perceptions have been carried out, especially during the COVID-19 pandemic. Due to the specific emergency context of the pandemic, the results of most of these surveys may not have wider applicability or validity. Depending on the specific survey design and development process, such surveys may potentially be repeated or adopted into recurrent surveys of higher education students or staff, fielded either nationally or internationally (for example, through the Eurostudent survey instrument). Examples of recent national or international survey initiatives include:
In Hungary, the Ministry for Innovation and Technology commissioned two surveys on digital higher education in 2020, administered by the Digital Higher Education Competence Centre. A first survey sought information on digital practices and institutional leaders’ views on factors determining the extent of digitalisation in their organisations, including external factors (e.g. students’ digital skills) and internal factors (e.g. access to digital infrastructure at an HEI, teachers’ digital skills, etc.). A second survey collected data on access to digital infrastructure at Hungarian HEIs, including high-speed Internet access and the availability of digital tools.
The European Students’ Union carried out a survey of student experiences during the COVID‑19 lockdown, which included questions about access to hardware, software and connectivity, students’ perception of the advantages and disadvantages of online learning, and students’ perception of their digital capabilities.
Aim to integrate relevant international indicators into national monitoring and evaluation frameworks
Countries are increasingly interested in comparing their performance on digitalisation with other countries, as part of national monitoring and evaluation efforts (Trucano, 2019[56]). Three distinct categories of international indicators are available: general digitalisation performance indicators, policy indicators, and international surveys or assessments that have elements relevant for digitalisation. Each of these are discussed in turn below.
General indicators of digital performance
Digitalisation is an engine of economic growth, job creation and social connectivity, and digital innovation has become a central pillar of government policy across many areas. As the digital economy grows, a range of measurement frameworks have emerged that aim to give greater visibility to the digital aspects of various economic sectors and to the impact of digitalisation (OECD, 2022[3]).
Most existing measurement frameworks aim to assess progress on digitalisation across a broad range of economic and social sectors (Table 9.3). A common approach is to develop composite performance indices based on a range of indicators. Some monitoring tools operate on a global scale. For example, the Network Readiness Index of the Portulans Institute ranks 130 global economies on technology development and on countries’ ability to capitalise on digital opportunities; it is a composite index based on four primary pillars: technology, people, governance and impact (Portulans Institute, 2021[57]). The World Digital Competitiveness Ranking of the International Institute for Management Development (IMD) measures the capacity of 64 economies to adopt digital technologies that transform government practices, business models and society in general.
In Europe, prominent examples of digitalisation monitoring tools include the European Union’s Digital Economy and Society Index (DESI), a composite index that monitors broadband connectivity, human capital for digitalisation, the integration of digital technology, and digital public services (European Commission, 2022[58]). Another European framework is the Centre for European Policy Studies’ Index of Readiness for Digital Lifelong Learning (IRDLL), which measures three key dimensions: 1) learning participation and outcomes; 2) institutions and policies for digital learning; and 3) availability of digital learning (CEPS, 2020[59]).
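To make the mechanics of such composite indices concrete, the stylised sketch below normalises each indicator to a common 0-100 scale and combines the results as a weighted average. The country values and dimension weights are invented for illustration; actual indices such as DESI or the NRI use more elaborate normalisation, imputation and weighting schemes.

```python
# Minimal sketch of composite index construction: min-max normalisation of
# each indicator to a 0-100 scale, then a weighted average across dimensions.
# All figures below are illustrative, not real data from any named index.

RAW = {
    # country: (broadband take-up %, digital skills %, e-government users %)
    "Country A": (78.0, 54.0, 67.0),
    "Country B": (91.0, 70.0, 80.0),
    "Country C": (63.0, 41.0, 52.0),
}
WEIGHTS = (0.4, 0.3, 0.3)  # illustrative dimension weights summing to 1

def min_max(values):
    """Normalise a list of values to the 0-100 range."""
    lo, hi = min(values), max(values)
    return [100 * (v - lo) / (hi - lo) for v in values]

countries = list(RAW)
columns = list(zip(*RAW.values()))            # transpose: one tuple per indicator
normalised = [min_max(col) for col in columns]

for i, country in enumerate(countries):
    score = sum(w * norm[i] for w, norm in zip(WEIGHTS, normalised))
    print(f"{country}: composite score {score:.1f}")
```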
Table 9.3. Selected international monitoring tools for general digital performance

| Name of the monitoring tool | Description | Country coverage and periodicity |
| --- | --- | --- |
| Digital Economy and Society Index (DESI), based on DigComp | Four key dimensions covering 37 indicators: 1) human capital (internet user skills and advanced skills); 2) connectivity (fixed broadband take-up, fixed broadband coverage, mobile broadband and cost); 3) integration of digital technology (business digitisation and commerce); 4) digital public services (e-government). A fifth dimension, use of internet services (citizens’ use of internet services and online transactions), was dropped in 2021 | EU countries; published annually since 2014 |
| International Institute for Management Development (IMD) World Digital Competitiveness Ranking (WDC) | Four principal dimensions covering 334 sub-indicators: 1) economic performance (domestic economy and employment); 2) government efficiency (public finance and societal framework); 3) business efficiency (labour market and productivity); 4) infrastructure (education and technological and scientific infrastructure) | 64 world economies; published annually since 1989 |
| Centre for European Policy Studies (CEPS) Index of Readiness for Digital Lifelong Learning (IRDLL) | Carried out as a collaboration between CEPS and Grow with Google, the index combines conventional indicators with alternative data sources, such as indicators of expert consensus and data from internet searches. Three primary pillars: 1) individual learning outcomes; 2) institutions and policies for digital learning; 3) availability of digital learning | 27 EU member states; published in 2019 |
| Portulans Institute Network Readiness Index (NRI), 3rd edition | Four key dimensions that make up a composite index: 1) technology (access, content and future technologies); 2) people (individuals, businesses, governments); 3) governance (trust, regulation, inclusion); 4) impact (economy, quality of life, SDG contributions) | 130 global economies; published annually since 2019, when the Portulans Institute took over the index from the World Economic Forum |

Source: Author’s elaboration.
In addition to these existing indicators of digital readiness and performance, the World Bank has developed an EdTech Readiness Index (ETRI), which aims to go beyond measuring the availability of devices and the level of connectivity to capture key elements of the larger education technology ecosystem in a country. The index is organised around six pillars: the first three focus on the actors in the education system (school management, teachers, students), and the last three examine the inputs and infrastructure that these actors need to use EdTech (devices, connectivity, digital resources). For each pillar, the ETRI reports a practice indicator (capturing practices at the school level), a de jure policy indicator (capturing whether there is a policy to inform each practice) and a de facto policy indicator (measuring the extent to which the policy is implemented). Piloting of the ETRI began in 2022, with the first surveys conducted in Ho Chi Minh City (Viet Nam) (Venegas Marin et al., 2021[60]; Hu’o’ng, 2022[61]). It could thus provide an additional source of evidence in the future on the readiness of various school systems for digital education.
Comparative policy indicators
As well as integrating comparative indicators of performance, policy makers are interested in comparing their policy frameworks and progress on the digital transformation of education with those of other countries, as a means of understanding the extent to which national policies are aligned with international best practice.
International policy surveys have become more prevalent in recent years. Indeed, many countries contribute information about the characteristics of their education systems and recent reforms to international initiatives such as the Eurydice comparison of education systems (Eurydice, 2022[62]) and the OECD Education Policy Outlook (OECD, 2018[63]). Such surveys provide useful information for countries wishing to learn about reforms in other jurisdictions, or to get a snapshot overview of how their policy framework compares with that of other systems. But in general, policy surveys are not carried out on a regular and recurring basis, limiting their suitability for inclusion in a monitoring framework. Therefore, integrating comparative policy indicators into monitoring frameworks would require new surveys to be designed, or existing qualitative data collections to be adapted and/or repeated.
With these caveats in mind, and on the digitalisation of education more specifically, the European Commission, in collaboration with the European Education and Culture Executive Agency and Eurydice, undertook a review of member states’ digital education policies and state of play in 2018/19, prior to the pandemic (European Commission/EACEA/Eurydice, 2019[6]). In preparing this report, the OECD has updated some of its elements, as presented in Chapter 2.
Going beyond the European policy landscape, the OECD Centre for Educational Research and Innovation (CERI) is currently collecting qualitative information across OECD countries about the governance of digital education and about public-private relations between governments and the educational technology industry at the primary, lower secondary and upper secondary levels of education, including secondary VET. This policy survey is gathering responses from central, state/regional and local authorities. Currently, 26 countries (including 18 EU countries) have responded or confirmed that they will respond to the survey. Results from the OECD CERI survey will be released in 2023. In addition, a fixed-response OECD Higher Education Policy Survey was fielded in 2022, collecting evidence on digitalisation at the tertiary level (Box 9.1). Comparative information on digitalisation policies in education will also be collected as part of a new OECD project on “Resourcing school education for the digital age – effective digitalisation and future-ready teachers”.
Box 9.1. Policy indices and the OECD Higher Education Policy Survey
Cross-national indicators that monitor progress on digitalisation in higher education systems have yet to be developed. However, international policy surveys can provide some comparative information on how well the policy environment is adapted to the development of effective digital technologies in higher education systems. For example, the OECD Higher Education Policy Survey is a fixed-response survey instrument used to collect comparative data on higher education policies across OECD member states (as well as partner countries, OECD accession candidates and other EU members). The 2022 edition of the survey focused in part on digitalisation policies, establishing a baseline set of comparative data on the regulation and governance of digitalised higher education and on the financial and human resources available for digitalisation. Comparative country data based on the survey will be published in 2023.
Fixed-response policy surveys could be a useful vehicle for monitoring the policy environment for digital education. The fixed nature of the items limits the cost of participation for responding jurisdictions, and the resulting dataset can be used in many ways to provide insights into the policy frameworks governing digitalisation in higher education. For example, there is increasing interest in indices that can be used to monitor and benchmark policy implementation in emerging areas of national and international importance. Recent examples of such indices include the SME Policy Index (OECD et al., 2020[64]) and the OECD Digital Government Index (OECD, 2020[65]). The Higher Education Policy Survey data are suitable for the construction of such indices. For example, an index of “policy support for improving digital pedagogy” might be constructed from responses to questions about academic workload policies and support for professional development, while an index of “support for digital learners” might be constructed from items on policies with respect to student grant and loan support, virtual student services, and connectivity and hardware support for students (a simple construction of this kind is sketched after this box).
In future years, the digitalisation questions from the OECD Higher Education Policy Survey could be repeated to assess the evolution of policies and monitor trends, extended to cover digitalisation policies concerning other levels of education, and their data coverage strengthened through partnerships with other international organisations.
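As a stylised illustration of the index construction described in Box 9.1, the sketch below builds a simple additive index from yes/no policy items. The item names and responses are hypothetical, not actual OECD Higher Education Policy Survey items.

```python
# Illustrative sketch of an additive policy index built from fixed-response
# (yes/no) survey items. Items and responses below are invented.

PEDAGOGY_ITEMS = [
    "workload_policy_recognises_digital_teaching",
    "funded_professional_development_on_digital_pedagogy",
    "national_guidelines_on_digital_pedagogy",
]

def policy_index(responses, items):
    """Share of 'yes' answers on the selected items, scaled to 0-100."""
    answered = [responses[i] for i in items if i in responses]
    if not answered:
        return None  # the jurisdiction did not answer any relevant item
    return 100 * sum(answered) / len(answered)

jurisdiction = {  # 1 = yes, 0 = no (hypothetical responses)
    "workload_policy_recognises_digital_teaching": 1,
    "funded_professional_development_on_digital_pedagogy": 1,
    "national_guidelines_on_digital_pedagogy": 0,
}
print(policy_index(jurisdiction, PEDAGOGY_ITEMS))  # -> roughly 66.7
```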
International surveys or assessment indicators
A range of international large-scale surveys or student assessments provide other promising sources of evidence for the development of monitoring frameworks on digital education. Indeed, a number of international surveys gather data at various levels of school education, and as they are typically repeated at regular intervals and built on representative samples of respondents, they can provide a valuable source of evidence coming from practitioners on the ground, while yielding estimates of various indicators at the system level.
Large-scale surveys and assessments offer a range of benefits to participating education systems. First, the development of the instruments (questionnaires and tests) generates economies of scale, as development costs are shared among a large number of participants; they therefore tend to be more cost-effective to develop than national surveys. Second, large-scale surveys and assessments harness expertise from around the world, pooling highly specialised expertise in large consortia and having the survey instruments reviewed by experts from multiple countries to strengthen their validity. Third, they yield internationally comparable indicators, which allow countries not only to monitor progress over time but also to gauge their state of digital maturity relative to peer education systems.
Yet large-scale international surveys and assessments also involve constraints. Their repetition creates some inertia in the survey or test content, which must remain stable enough to capture trends over time. They also involve extensive negotiations among countries on the survey focus, and countries may not be able to monitor all aspects of interest to them, although there is usually some flexibility for country-specific questions. Lastly, these surveys can be higher stakes than national surveys given the production of comparative data (with the attendant risk of interpretation as rankings), and they typically apply very strict technical standards, which can lead to non-adjudication of a country’s data if, for instance, response rates fall short of requirements.
Depending on their nature – survey or assessment – and their target population for sampling – students or teachers – these survey tools will be more or less useful to policy makers in monitoring policies and progress. For instance, policy makers interested in advancing equity goals will look for indicators expressed in terms of the percentage of students benefitting from quality digital resources or infrastructure, or, on the contrary, suffering from shortages in these areas. This is the sampling approach followed by the European Survey of Schools: ICT in Education (ESSIE) as well as by all student assessments (PIRLS, TIMSS, PISA, ICILS). By contrast, policy makers monitoring progress in infrastructure upgrades or capacity-building programmes will be more interested in indicators expressed in terms of the percentage of schools with adequate infrastructure or the percentage of teachers lacking specific skills and needing training. This is the sampling approach pursued by TALIS. Accordingly, no single survey will provide the full range of evidence for an ideal monitoring framework, and combining evidence from different surveys and assessments can provide richer data for system monitoring and diagnosis. Annex 9.A provides a description of various international surveys and assessments that could prove useful for national monitoring and assessment.
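The distinction between student-based and school-based sampling matters for interpretation. The minimal sketch below, using invented figures, shows how the same underlying situation yields different headline numbers depending on whether schools or students are the unit of analysis.

```python
# Sketch of how the sampling unit changes what an indicator measures.
# Counting students in well-equipped schools (a PISA-style, student-based
# figure) and counting well-equipped schools (closer to a school- or
# teacher-based figure) give different answers. Data below are invented.

schools = [
    # (school id, adequate infrastructure?, number of students represented)
    ("S1", True, 1200),
    ("S2", False, 300),
    ("S3", True, 900),
    ("S4", False, 250),
]

pct_schools = 100 * sum(ok for _, ok, _ in schools) / len(schools)
total_students = sum(n for _, _, n in schools)
pct_students = 100 * sum(n for _, ok, n in schools if ok) / total_students

print(f"Share of schools with adequate infrastructure: {pct_schools:.0f}%")   # 50%
print(f"Share of students in such schools: {pct_students:.0f}%")              # 79%
```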
Leverage insights from institution-level external quality evaluations
National frameworks for quality assurance in education may include a range of institution-level quality assurance procedures. Though specific evaluation and quality assurance procedures may vary in their characteristics, schools tend to undergo periodic evaluation by public inspection authorities (OECD, 2013[66]), while higher education institutions are subject to external accreditation (at institution and programme level) by public and private bodies (OECD, 2013[66]; OECD, 2019[67]).
School inspections and external evaluations of higher education institutions often give rise to a formal written report detailing the findings of the evaluator(s). In the vast majority of OECD countries, reports from formal school evaluations are made publicly available (OECD, 2015[68]). Similarly, reports of findings from external quality assurance audits of higher education institutions are made publicly available in many OECD jurisdictions.
Increasingly, inspection reports contain insights into the access and use of digital technologies:
In Slovakia, for example, the State School Inspectorate’s central evaluation framework includes the use of digital technologies for teaching as an explicit criterion for the evaluation of education facilities and resources in schools (European Commission/EACEA/Eurydice, 2015, p. 153[69]).
Likewise, Scotland’s Digital Learning and Teaching Strategy stressed the importance of aligning self-evaluation guidance for schools and school inspection criteria with its vision for digital education. Specifically, Education Scotland committed to ensuring that self-evaluation guidance references the importance of using digital technology to enhance learning and teaching, that inspections include a focus on the effective and innovative use of digital technology, and that inspectors have a sound understanding of effective and innovative uses of digital technology in education (Scottish Government, 2016, p. 29[70]).
In New Zealand, the effective use of digital devices and digital resources for learning is listed as one of the indicators for “Leadership and Excellence” assessed in school evaluations (Education Review Office New Zealand, 2016[71]).
A similar trend is observed in vocational education and training, driven by several EU initiatives, with the development of guidelines for quality assurance in e-learning (Vaiouli, 2021[72]).
In higher education, few examples exist of fully developed national standards and guidelines for quality assurance of education delivered online. However, many countries are revising their existing guidelines in response to the COVID-19 pandemic and the proliferation of online, hybrid and blended programmes. A recent OECD review shows that, as of 2022, 19 OECD countries have specific guidelines or regulations in place covering online, hybrid or blended learning (Staring et al., 2022[73]). Future external evaluation reports are therefore likely to contain more information and insight from evaluators on the use of digital technologies within education institutions.
The wealth of information available in individual evaluation reports is a potentially valuable source of insight into the use, perception and impact of digital technologies within schools and higher education institutions. The qualitative nature of the reports and their lack of structured content have stymied attempts to extract insights efficiently and in a comparable way. However, in jurisdictions where a common report structure is in place, new meta-analytical and content analysis techniques are opening up possibilities for the structured extraction of insights and reflections about digital technologies. For example, a recent large-scale study of school inspection reports in the United Kingdom demonstrated the potential of automated text mining to complement small-scale manual qualitative analysis (Bokhove and Sims, 2020[74]).
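As an illustration of the general approach (not of the specific method used by Bokhove and Sims (2020)), the short sketch below counts mentions of digital education terms in a small, invented corpus of inspection report excerpts. A real analysis would operate on thousands of documents and could add sentiment scoring or topic modelling.

```python
# Minimal sketch of text mining applied to inspection reports: count
# occurrences of digital education terms per report. Report texts and the
# term list are invented for illustration.
import re
from collections import Counter

DIGITAL_TERMS = {"digital", "ict", "tablet", "laptop", "online", "software"}

reports = {
    "school_001": "Pupils use tablets confidently; digital resources support "
                  "learning in mathematics.",
    "school_002": "Teaching is effective. ICT use was not observed during "
                  "the inspection.",
}

for report_id, text in reports.items():
    tokens = re.findall(r"[a-z]+", text.lower())     # crude tokenisation
    hits = Counter(t for t in tokens if t in DIGITAL_TERMS)
    print(report_id, dict(hits))
```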
A European database of external quality assurance reports from higher education institutions (DEQAR) has been initiated; it contains more than 75 000 external programme and institutional accreditation reports covering more than 3 200 European institutions (EQAR, n.d.[75]). Although report structures are not comparable across jurisdictions and the content is not standardised, researchers are beginning to use the database for pilot analyses (see, for example, EQAR (n.d.[76])). The DEQAR may thus yield new insights into the state of digital higher education across Europe.
Support the generation of research evidence on the impact of digital education and promote greater use of research insights
Supporting the selection, suitability and effective pedagogical use of digital technologies requires a good understanding of their impact on student learning and non-cognitive outcomes. As described in the preceding chapters, rigorous evidence on the causal effects of digital education technologies remains sparse. However, policy makers can play an important role in strengthening this evidence base on the impact of digital education technologies. Government statistical agencies can support this effort by investing in data collections that generate descriptive information about the use of digital technologies and combine them with the collection and consolidation of administrative and performance data. Public research funding bodies can invest in research that yields reliable inferences about the causal effect of digital technologies. Decision makers can also promote policy experimentation and pilots and ensure their systematic evaluation (Köster, Shewbridge and Krämer, 2020[77]).
Even where research is available, the results of individual studies are generally specific to a particular context or student cohort, or focus on the presence or use of a single technology. This fragmented research landscape calls for more attention at the national level to assessing, curating and brokering available evidence so that research results can be integrated into the development of policy and practice. However, knowledge mobilisation for decision making in education is often considered underdeveloped compared with some other fields, notably the health sector, although the number of intermediary organisations aimed at mobilising and communicating research evidence has increased over the last two decades (Torres and Steponavičius, 2022[78]). As a result, education research is often perceived to be less influential and less useful in the development of policy or in changing practice (Rycroft-Smith, 2022[79]).
Many countries have sought to improve their capacity to integrate evidence by establishing organisations with a specific mandate to review and curate research, and platforms intended to disseminate research in an accessible way. Examples of such initiatives include the Teaching and Learning Toolkit of the United Kingdom’s Education Endowment Foundation, a “What Works” centre (EEF, n.d.[80]), the Clearinghouse for Educational Research in Denmark (DPU, 2022[81]) and the Swiss Co‑ordination Centre for Research in Education. In the United States, the Campbell Collaboration was conceived and established in 2000 as an education-focused counterpart to the Cochrane Collaboration, which has been providing systematic reviews of health care research since 1994. Regional Campbell centres have since been established for the Nordic countries, the United Kingdom and Ireland, and South Asia (Campbell Collaboration, n.d.[82]).
Despite progress, not all efforts at knowledge brokerage in education have gained traction, and few of them specifically deal with the topic of digitalisation. This stands in contrast to the field of health, where national Health Technology Assessment organisations, whose mission is to review evidence and provide an assessment of the value of health technologies, are commonplace (INAHTA, n.d.[83]).
As part of the development of a national monitoring and evaluation framework, governments could thus explore ways of applying the health technology assessment approach to education technologies, by expanding the remit of existing organisations or creating new ones. For example, Sweden’s national health technology assessment body (SBU) expanded its original remit to cover systematic reviews of social services (SBU, n.d.[84]). If resources do not permit the establishment of a permanent function, governments could consider developing one jointly with regional partners or neighbouring countries. Governments can also consider funding systematic reviews or meta-analyses of research on a regular basis, to ensure that emerging evidence can inform policy and practice. For instance, the Government of Scotland invested in strengthening the evidence on digital education technologies by commissioning and disseminating a review of the scientific literature. The review aimed to identify the impacts of digital technology on learning and teaching in primary and secondary schools and, more specifically, how digital technology can support and contribute to the government’s five educational priorities: raising attainment, tackling inequalities and promoting inclusion, improving transitions into employment, enhancing parental engagement, and improving the efficiency of the education system (ICF Consulting Services Ltd, 2015[85]). The review was commissioned after an Education Scotland report concluded that change in the use of technologies in schools “has been modest at best” and that digital technologies could have a much more significant influence on learning, motivating learners and encouraging career ambitions.
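Where governments commission systematic reviews or meta-analyses, the statistical core is often straightforward. The sketch below, using invented effect sizes, shows the basic fixed-effect, inverse-variance pooling of study results that underpins many meta-analyses of education technology interventions.

```python
# Sketch of inverse-variance (fixed-effect) meta-analytic pooling.
# Effect sizes and standard errors below are invented for illustration.
import math

studies = [
    # (study label, effect size d, standard error)
    ("Study A", 0.25, 0.10),
    ("Study B", 0.10, 0.08),
    ("Study C", 0.40, 0.15),
]

weights = [1 / se**2 for _, _, se in studies]     # inverse-variance weights
pooled = sum(w * d for w, (_, d, _) in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect: {pooled:.2f} (95% CI "
      f"{pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")
```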
Design new approaches to evidence development, drawing on emerging methodologies and commercial data sources
Another direction for policy would be to capitalise on the process of digital transformation itself to strengthen the evidence base for effective digital education. Education data mining (the application of data analytics to answer education research questions) and learning analytics (the use of data analytics to understand and improve teaching and learning) have been recognised as an emerging field of research for more than a decade (Romero and Ventura, 2013[86]), and methodology continues to improve, along with access to research datasets. The use of digital tools, including educational software, in the classroom thus generates a range of potentially valuable data, providing new insights into usage patterns, how these link to user profiles and how they lead to different learning outcomes.
In fact, digitalisation brings new measurement opportunities: combined with student outcome data, the data generated by learning management systems (LMS) and virtual learning environments can yield rich insights into student engagement in learning and can be used to support student success. Alongside national administrative data collections and surveys of higher education students and staff, learning analytics can now serve as an additional source of evidence. For instance, data generated by widely used digital learning platforms provide unprecedented opportunities to evaluate the effectiveness of pedagogical practices. However, this potential has rarely been exploited, owing to the need to create new networks of collaboration linking data custodians, researchers and EdTech firms. In some countries, public authorities have begun to leverage existing, widely used digital learning platforms for rigorous education research. In the United States, for example, the Institute of Education Sciences has launched five projects linked to learning platforms, one of which is developing a plug-in for widely used LMS that enables teachers or researchers to collect informed consent, assign different versions of online learning activities to students, and export de-identified study data for analysis (Institute of Education Sciences, n.d.[87]).
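To indicate, in stylised form, what such a research plug-in involves, the sketch below deterministically assigns students to activity variants and exports pseudonymised records for analysis. All field names and identifiers are assumptions for illustration; this is not a reproduction of the IES-funded tool.

```python
# Minimal sketch of two building blocks of platform-based education research:
# deterministic variant assignment and de-identified data export.
# Identifiers, field names and the salt below are hypothetical.
import csv
import hashlib

SALT = "replace-with-a-secret-salt"  # keeps pseudonyms stable but hard to reverse

def pseudonym(student_id: str) -> str:
    """One-way pseudonym so exported rows cannot be traced back directly."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

def assign_variant(student_id: str, variants=("A", "B")) -> str:
    """Deterministic assignment: the same student always gets the same arm."""
    digest = hashlib.sha256((SALT + "arm" + student_id).encode()).digest()
    return variants[digest[0] % len(variants)]

def export(records, path="study_export.csv"):
    """Write de-identified study data: pseudonym, variant and outcome only."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["pseudonym", "variant", "quiz_score"])
        for student_id, score in records:
            writer.writerow([pseudonym(student_id), assign_variant(student_id), score])

export([("student-42", 0.8), ("student-77", 0.6)])
```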
Apart from the complexity of co-ordinating multiple private and public stakeholders, sensitivities and caution regarding the implications for privacy and the fairness of decision processes have so far stood in the way of using learners’ “digital footprints”, whether at the level of individual classrooms or at scale (Slade and Prinsloo, 2013[88]). Policy makers and education stakeholders are also increasingly aware of, and responsive to, the need for robust policies and regulations to protect learner privacy. For those elements of data mining and learning analytics where regulation does not yet generally exist (such as the use of algorithms), there is a growing push for an ethical approach. For example, recent OECD analysis stresses the need for humans to continue to play a key role in decision-making processes regarding engagement with at-risk students, rather than fully automating them (OECD, 2021[19]). International education organisations such as the International Council for Open and Distance Education and the Association for the Advancement of Computing in Education have also collaborated on the development of global guidelines for the ethical use of learning analytics (AACE, 2019[89]).
Beyond learning analytics and education data mining, there is potential to derive insights about learner characteristics, motivations, pathways and outcomes from so-called “alternative data sources”, such as citizen-generated, open-source or commercial data. Building the capacity to make use of alternative data sources, and developing the methodological skills needed to use them in robust evaluative processes, requires resources beyond what many individual governments can allocate. Thus, in the digital era, a next‑generation monitoring and evaluation infrastructure for policy making may need to rely increasingly on partnerships with the private sector and research organisations, as well as on stakeholder engagement, in order to tap the potential of emerging data sources (OECD, 2019[90]). Few examples can be found to date of the systematic integration of alternative data into the monitoring and evaluation of digital education. One example is the Centre for European Policy Studies (CEPS) Index of Readiness for Digital Lifelong Learning, which was developed as a collaboration between CEPS and Grow with Google and used data from Google searches to assess learner interest in digital education (CEPS, 2020[59]).
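As a stylised illustration of how such alternative data might be summarised, and under the assumption that a search-interest series can be exported as weekly counts, the sketch below rescales a series to a 0-100 index relative to its peak, similar in spirit to how search-trend tools report interest. The weekly counts are invented.

```python
# Hedged sketch: rescale an (invented) weekly search-volume series for an
# education-related topic to a 0-100 interest index relative to its peak.

weekly_queries = {  # week -> query volume (illustrative figures)
    "2023-W01": 130, "2023-W02": 180, "2023-W03": 240, "2023-W04": 150,
}

peak = max(weekly_queries.values())
interest_index = {week: round(100 * v / peak) for week, v in weekly_queries.items()}
print(interest_index)  # e.g. {'2023-W01': 54, '2023-W02': 75, '2023-W03': 100, ...}
```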
Potential indicators to monitor and evaluate digital education ecosystems
Digitalisation in education is a multifaceted policy issue. Effective monitoring of the penetration and impact of digitalisation must take account of the extent to which infrastructure is in place to support digitalisation, the effective use of digital technologies in teaching and learning, and the extent to which digitalisation has a positive impact on learners. The analysis and findings of the previous chapters, along with the considerations put forward in this chapter, lead to a proposal of key generic indicators for digital education that countries should seek to monitor using existing national or international data sources. These indicators are applicable to all levels of education and, together, should provide national governments with a comprehensive view of the extent to which their education system is integrating and making effective use of digital technologies.
Table 9.4 presents a proposal for generic indicators that should be prioritised in the development of a national monitoring and evaluation infrastructure. Ideally, the data sources used to populate these indicators should permit regular assessment of the extent to which indicator values are equal across the system, for example across different regions of the country or across socio-economic groupings of national importance.
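To illustrate the kind of disaggregated monitoring suggested above, the sketch below computes an indicator value per region and the gap between the best- and worst-served regions. All regional figures are invented.

```python
# Sketch of a disaggregated equity check for one indicator: share of students
# with school broadband access, by region, plus the regional gap. Invented data.

connected_students = {  # region -> (students with school broadband, total students)
    "Region North": (18_000, 20_000),
    "Region South": (12_500, 20_000),
    "Region East": (9_000, 10_000),
}

rates = {r: 100 * ok / total for r, (ok, total) in connected_students.items()}
for region, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{region}: {rate:.0f}%")
print(f"Equity gap: {max(rates.values()) - min(rates.values()):.0f} percentage points")
```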
Table 9.4. Proposed generic indicators for priority inclusion in national monitoring and evaluation frameworks for digitalisation in education

| # | Analytical dimension | Indicator | Potential sources of national data | Examples of potential international indicators |
| --- | --- | --- | --- | --- |
| 1 | Strategic vision for digital education | Existence of a strategic vision and associated action plan (e.g. with measurable, time-bound objectives with respect to digitalisation in education) | Policy questionnaire | Recent publication of a new or updated policy strategy for digital education (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Elements of digitalisation for which public bodies set higher education system- or subsystem-level targets or objectives (OECD Higher Education Policy Survey) |
| 2 | Adapting pedagogical approaches, curricula and assessments | Take-up of, and attitudes towards, the usage and impact of digital education technology | Student, educator and education institution management surveys; school inspection or quality assurance reports | Percentage of students who report the use of a digital device for learning or teaching during lessons in the past month, by subject (e.g. test language lessons, science, mathematics) and by type of user (teacher only, student only, both student and teacher) (PISA) • Frequency with which students engage in a range of ICT-based learning activities during lessons (searching the internet to collect information; downloading/uploading/browsing material from the school’s website; sending or reading email messages; chatting online for school work; using a word processing, spreadsheet or presentation programme; coding/programming apps, programmes or robots; using computers when working in groups; participating in online training programmes; learning with educational software, games, apps and quizzes) (ESSIE) • Frequency with which students use a range of ICT learning materials in lessons (exercise software; online quizzes and tests; learning applications on a smartphone or tablet; text edition tools; image edition tools; multimedia production tools; broadcasting tools; data logging tools; computer simulations; digital learning games, computer/video games) (ESSIE) • Percentage of students in schools where teachers and head teachers hold positive opinions about whether ICT should be used for students (ESSIE) • Percentage of students whose parents hold positive attitudes towards the use of ICT at school (ESSIE) • Percentage of students who hold positive attitudes towards the use of ICT at school (ESSIE) • Potential indicators for higher education systems could be developed by adding a specific question on attitudes to digitalisation in international surveys of students and staff (e.g. Eurostudent for students or the OECD International Survey of Science for staff) |
| | | Adaptation of the curriculum to digital education | Policy questionnaire | Existence of rules or guidelines about specific uses of digital technology in class (e.g. as part of curriculum requirements) (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) |
| 3 | Governance, guidance and regulation for digital education | Existence of participatory mechanisms for stakeholder engagement and co‑ordination with the private sector | Policy questionnaire | Existence of formal processes for government engagement with EdTech companies (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) |
| | | Existence of regulation or guidelines about digital security, the protection of personal data and the use of algorithmic models | Policy questionnaire | Existence of specific rules and guidelines about digital security and the protection of personal data (Y/N) (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Existence of rules or guidelines about the use of algorithmic models in education (e.g. allowing some types of algorithms and forbidding others) (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Existence of related policy levers regarding the ethical use of data and algorithms in the delivery of higher education (OECD Higher Education Policy Survey) |
| | | Coverage of digital education in quality assurance procedures | Policy questionnaire | Integration of criteria relating to digital education in school inspection/evaluation frameworks (Eurydice) • Revision of external quality assurance guidelines or methodology for higher education institutions or programmes to incorporate digital provision (OECD Higher Education Policy Survey) |
| 4 | Funding and procurement for digital education | Support for institutional procurement strategies and budget practices | Policy questionnaire | Direct procurement of digital technologies, price negotiation with suppliers at government level or provision of guidance to education institutions for the procurement of digital education technologies (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Aspects of digitalisation for which specific public allocations have been made to higher education institutions in the past five years (e.g. from targeted, special-purpose or capital funds, or provided directly by a central body) (OECD Higher Education Policy Survey) |
| 5 | Accessible, innovative and high-quality infrastructure for digital education | Fast and reliable Internet connection in education institutions (average access and gaps) | NRENs, education institution reports | Percentage of students in schools where principals report that the school’s Internet bandwidth is sufficient (PISA) • Socio-economic and urban/rural divides in the percentage of students who report having access to an Internet connection at school (PISA) • Percentage of students in schools with high-speed Internet (above 100 Mbps) (ESSIE) • For higher education, international indicators could potentially be constructed from the comparative data collected annually by GÉANT and published in the NREN Compendium |
| | | Fast and reliable Internet connection at home (average access and gaps) | National statistics | Percentage of households with either 1) fast broadband (NGA), 2) fixed very high capacity network (VHCN) or 3) fibre to the premises (FTTP) (DESI – Digital Economy and Society Index) |
| | | Student access to key technological equipment in education institutions (level and gaps) | Education institution reports or surveys | Percentage of principals reporting that shortages or inadequacy of digital technology for instruction (e.g. software, computers, tablets, smart boards) hinder the school’s capacity to provide quality instruction (TALIS) • Urban/rural or socio-economic gap in students’ access to a portable computer in schools (PISA) |
| | | Student access to key technological equipment at home (level and gaps) | Surveys | Socio-economic gap in students’ access to a computer they can use for school work at home (PISA) • For higher education, potential indicators could be developed by adding a specific question on access to equipment in international surveys of students (e.g. Eurostudent) • Percentage of students with access to ICT devices at home (desktop computers with or without internet access; laptops, tablets, netbooks or mini notebooks with or without internet access; digital readers; video gaming systems; handheld game consoles; mobile phones with or without internet access; portable music or video players; camcorders or digital cameras; wearable devices) (ESSIE) |
| | | Innovation incentives or support for digital education technologies | Surveys | Government subsidies for research and development to encourage EdTech innovation (e.g. specific subsidies or commissions for R&D in education technology) (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Provision of monetary incentives by government authorities for the development of educational software or learning resources (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Availability of incentivising or supporting policies to enhance digital capabilities in higher education institutions (e.g. innovation funds, peer networks, awards and recognition for innovation) (OECD Higher Education Policy Survey) |
| 6 | Capacity building for digital education | Teachers’ digital skills | Self-reports in national surveys, national skills assessments | Teachers’ self-efficacy in supporting student learning through the use of digital technologies (TALIS) • Teachers’ confidence in their digital competence in five competency areas (information and data literacy; communication and collaboration; digital content creation; safety; problem solving) (expressed as a percentage of students) (ESSIE) • Resources provided by public authorities or publicly supported non-governmental organisations (e.g. co‑operatives, foundations, associations) to cultivate digital capabilities in staff teaching in higher education institutions (e.g. training on digital pedagogy, training on relevant software) (OECD Higher Education Policy Survey) |
| | | Teacher professional development needs | Surveys | Percentage of teachers expressing a high need for further training in ICT skills for teaching (TALIS 2018) |
| | | Parents’ capacity to support their children’s digital learning | Self-reports in national surveys, national skills assessments | Percentage of students with parents confident in teaching their children about safe and responsible Internet behaviour (a four-level confidence scale based on parents’ confidence in teaching their child how 1) to behave safely online, e.g. preventing cyberbullying, 2) to behave safely to protect their privacy and 3) to manage their digital identity and reputation) (ESSIE) |
| | | Students’ digital skills | Self-reports in national surveys, national skills assessments | Students’ perceived ICT competence (PISA) • Student computer and information literacy (CIL) or computational thinking (CT) achievement at four levels of proficiency (ICILS) • Association of CIL/CT achievement with 1) gender, 2) socio-economic background, 3) immigrant background and 4) language background (ICILS) |
| 7 | Human resource policies for digital education | Incentives and support for teachers’ engagement in digital education | Surveys, policy questionnaire | Percentage of students in schools whose principal agreed or strongly agreed that teachers have sufficient time to prepare lessons integrating digital devices (PISA) • How delivery of education through digital means is taken into account in workload allocation models for teaching staff (OECD Higher Education Policy Survey) • Percentage of students in schools where principals report that the school has sufficient qualified technical staff (PISA) • Percentage of students in schools with incentives to reward teachers for using ICT in teaching and learning (additional training hours; reduced teaching hours; additional ICT equipment for the classroom; financial incentives; competitions and prizes; honorary titles) (ESSIE) |
| 8 | Monitoring and evaluation for digital education | Coverage of digital education in monitoring and evaluation processes | Policy questionnaire | Existence of rules or guidelines about the monitoring or evaluation of the effectiveness of using digital technologies in education (e.g. providing public information about evaluation results) (OECD Policy Questionnaire on governance and public-private relations regarding education data and digital technology) • Percentage of schools with a policy or actions to assess the outcomes of using ICT for teaching and learning (ESSIE) |
The indicators described in Table 9.4 may be monitored using national and international data sources such as ICILS (IEA, 2022[91]), PISA (OECD, 2022[92]), TALIS (OECD, 2022[93]) or GÉANT (GÉANT Association, 2022[94]). In addition, many relevant international indicators exist that provide insight into the presence of enabling factors for digitalisation in education, and these can replace or supplement national data sources.
Key messages
In light of the ambiguous research evidence on the effects of digital technologies on learning outcomes and the high costs associated with digitalising education systems, careful monitoring and evaluation of digital education policies is paramount. However, this chapter highlights significant gaps in national evidence structures regarding the implementation and effectiveness of digital education policies. A lack of information on both investments in digital education and policy outcomes undermines governments’ capacity to assess the cost-effectiveness of digital education policies. Similarly, a limited understanding of the extent to which digital education policies have been implemented, and of where gaps persist, limits governments’ ability to direct policy focus to where it is most needed. To build a stronger evidence foundation for digital education policies, this chapter thus emphasises the importance of comprehensive monitoring and evaluation frameworks.
In designing monitoring and evaluation structures, governments must strike a balance between gathering relevant evidence and minimising the reporting burden for institutions and education stakeholders. To this end, this chapter suggests a range of existing data sources that national governments might leverage to monitor and benchmark the state of digitalisation in their education systems, including data from international surveys and comparative policy indicators. The analysis also highlights promising examples of countries that have adapted their evidence structures to digital education, for instance by adding a digital lens to national administrative or statistical surveys or by including measures of digital education in external evaluation reports of education institutions. Finally, the chapter suggests a list of possible indicators that could be used to track the progress of education systems along the analytical dimensions of this report, together with the international data collections that countries might leverage for this purpose.
References
[89] AACE (2019), Global Guidelines: Ethics in Learning Analytics, https://www.learntechlib.org/p/208251/.
[36] Balaban, I., N. Begicevic Redjep and M. Klacmer Calopa (2018), “The Analysis of Digital Maturity of Schools in Croatia”, International Journal of Emerging Technologies in Learning (iJET), Vol. 13/06, p. 4, https://doi.org/10.3991/IJET.V13I06.7844.
[54] Bertelsmann Stiftung (n.d.), Monitor Digitale Bildung, https://www.bertelsmann-stiftung.de/de/unsere-projekte/teilhabe-in-einer-digitalisierten-welt/projektthemen/projektthemen-monitor/ (accessed on 9 August 2022).
[74] Bokhove, C. and S. Sims (2020), “Demonstrating the potential of text mining for analyzing school inspection reports: a sentiment analysis of 17,000 Ofsted documents”, Vol. 44/4, pp. 433-445, https://doi.org/10.1080/1743727X.2020.1819228.
[55] Bonne, L. and J. MacDonald (2019), Secondary schools in 2018: Findings from the NZCER national survey, https://www.researchgate.net/publication/342875891_Secondary_schools_in_2018_Findings_from_the_NZCER_national_survey (accessed on 21 August 2022).
[18] Bowen, W. (2012), “The ’Cost Disease’ in Higher Education: Is Technology the Answer?”.
[16] Bulman, G. and R. Fairlie (2016), “Technology and Education: Computers, Software, and the Internet”, Handbook of the Economics of Education, Vol. 5, pp. 239-280, https://doi.org/10.1016/B978-0-444-63459-7.00005-1.
[82] Campbell Collaboration (n.d.), Better evidence for a better world, https://www.campbellcollaboration.org/better-evidence.html# (accessed on 12 August 2022).
[17] Cellini, S. (2021), How does virtual learning impact students in higher education?, https://www.brookings.edu/blog/brown-center-chalkboard/2021/08/13/how-does-virtual-learning-impact-students-in-higher-education/.
[37] Centre for Applied Psychology at the Faculty of Philosophy in Rijeka (2018), Scientific Research on the Effects of the Project “e-Schools: Establishing a System for the Development of Digitally Mature Schools (Pilot Project)” Conclusions and recommendations, https://pilot.e-skole.hr/wp-content/uploads/2019/11/Conclusions_and_recommendations.pdf (accessed on 27 September 2022).
[59] CEPS (2020), Index of Readiness for Digital Lifelong Learning – CEPS, https://www.ceps.eu/ceps-publications/index-of-readiness-for-digital-lifelong-learning/ (accessed on 12 August 2022).
[34] Department of Education, Australia NSW (n.d.), Schools Digital Strategy - How the SDS will help you, https://education.nsw.gov.au/content/dam/main-education/en/home/about-us/strategies-and-reports/schools-digital-strategy/strategy-resources/documents/03_Schools_Digital_Strategy_How_the_SDS_will_help_you.pdf (accessed on 20 August 2022).
[42] Digital Agency et al. (2022), Roadmap on the Utilization of Data in Education, https://www.digital.go.jp/assets/contents/node/basic_page/field_ref_resources/0f321c23-517f-439e-9076-5804f0a24b59/20220307_en_education_outline_01.pdf.
[46] Direção-Geral de Estatísticas da Educação e Ciência (2020), Indicators, https://observatorio.incode2030.gov.pt/indicadores/ (accessed on 19 May 2023).
[81] DPU (2022), “Dansk Clearinghouse for Uddannelsesforskning”.
[71] Education Review Office New Zealand (2016), “School evaluation indicators: effective practice for improvement and learner success”, https://ero.govt.nz/how-ero-reviews/schoolskura-english-medium/school-evaluation-indicators (accessed on 16 December 2022).
[80] EEF (n.d.), Teaching and Learning Toolkit, https://educationendowmentfoundation.org.uk/education-evidence/teaching-learning-toolkit (accessed on 12 August 2022).
[75] EQAR (n.d.), Database of External Quality Assurance Results, https://www.eqar.eu/qa-results/search/ (accessed on 1 August 2022).
[76] EQAR (n.d.), Pilot studies - EQAR, https://www.eqar.eu/about/projects/deqar-project/pilot-studies/#research-questions-2 (accessed on 12 August 2022).
[44] Estevez, E. et al. (2021), Portugal Leapfrogging Digital Transformation, CAF.
[58] European Commission (2022), Broadband Coverage in Europe in 2021 | Shaping Europe’s digital future, https://digital-strategy.ec.europa.eu/en/library/broadband-coverage-europe-2021 (accessed on 21 August 2022).
[8] European Commission (2019), 2nd Survey of Schools : ICT in Education, EC Directorate-General for Communications Networks, Content and Technology, https://doi.org/10.2759/23401.
[9] European Commission (2013), Survey of Schools: ICT in Education - Benchmarking Access, Use and Attitudes to Technology in Europe’s Schools, Directorate-General for the Information Society and Media (European Commission), Brussels, https://doi.org/10.2759/94499.
[6] European Commission/EACEA/Eurydice (2019), Digital Education at School in Europe, Eurydice Report. Luxembourg: Publications Office of the European Union, https://eacea.ec.europa.eu/national-policies/eurydice/content/digital-education-school-europe_en (accessed on 4 August 2020).
[69] European Commission/EACEA/Eurydice (2015), Assuring Quality in Education: Policies and Approaches to School Evaluation in Europe, Publications Office of the European Union, Luxembourg, https://doi.org/10.2797/65355.
[62] Eurydice (2022), National Education Systems, https://eurydice.eacea.ec.europa.eu/national-education-systems (accessed on 8 June 2021).
[10] Fraillon, J. et al. (2019), Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 International Report.
[94] GÉANT Association (2022), GÉANT Compendium, https://compendiumdatabase.geant.org/ (accessed on 30 September 2022).
[50] HESA (2020), Student 2020/21 - Location of study, https://www.hesa.ac.uk/collection/c20051/a/locsdy (accessed on 10 August 2022).
[7] Heymans, P. et al. (2018), Monitor for ICT Integration in Flemish Education (MICTIVO): The Theoretical and Methodological Framework.
[61] Hu’o’ng, H. (2022), “Chuyển đổi số giáo dục: Giáo viên và học sinh phải ở vị trí trung tâm [Digital transformation of education: Teachers and students must be at the centre] - Tuổi Trẻ Online”, https://tuoitre.vn/chuyen-doi-so-giao-duc-giao-vien-va-hoc-sinh-phai-o-vi-tri-trung-tam-20221028111922459.htm (accessed on 24 January 2023).
[85] ICF Consulting Services Ltd (2015), Literature Review on the Impact of Digital Technology on Learning and Teaching, Scottish Government, Edinburgh, http://www.gov.scot/publications/literature-review-impact-digital-technology-learning-teaching/ (accessed on 16 December 2022).
[91] IEA (2022), ICILS, https://www.iea.nl/studies/iea/icils (accessed on 30 September 2022).
[83] INAHTA (n.d.), The International Network of Agencies for Health Technology Assessment, https://www.inahta.org/ (accessed on 11 August 2022).
[87] Institute of Education Sciences (n.d.), Digital Learning Platforms to Enable Efficient Education Research Network, https://ies.ed.gov/ncer/projects/program.asp?ProgID=2119 (accessed on 23 February 2023).
[38] Government of Ireland (2022), Digital Strategy for Schools to 2027, https://www.gov.ie/en/publication/69fb88-digital-strategy-for-schools/#digital-strategy-for-schools-to-2027 (accessed on 19 August 2022).
[52] Jones, C. et al. (2022), “Building on Aotearoa New Zealand’s Integrated Data Infrastructure”, Harvard Data Science Review, Vol. 4/2, https://doi.org/10.1162/99608F92.D203AE45.
[77] Köster, F., C. Shewbridge and C. Krämer (2020), Promoting Education Decision Makers’ Use of Evidence in Austria, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/0ac0181e-en.
[20] Lorenceau, L., C. Maric and T. Mostafa (2019), Upgrading the ICT questionnaire items in PISA 2021, OECD Education Working Papers, https://www.oecd-ilibrary.org/education/upgrading-the-ict-questionnaire-items-in-pisa-2021_d0f94dc7-en (accessed on 9 August 2022).
[51] Miller, E. and J. Shedd (2019), “The History and Evolution of IPEDS”, New Directions for Institutional Research, Vol. 2019/181, pp. 47-58, https://doi.org/10.1002/IR.20297.
[53] Ministry of Education (2014), Extramural students’ participation and achievement: Trends, patterns and highlights, https://thehub.swa.govt.nz/assets/documents/41821_Extramural-Students-Report-07022014_0.pdf.
[35] Ministry of Education Italy (2022), PIANO SCUOLA 4.0, https://pnrr.istruzione.it/wp-content/uploads/2022/07/PIANO_SCUOLA_4.0_VERSIONE_GRAFICA.pdf (accessed on 27 September 2022).
[95] Mullis, I. et al. (2020), TIMSS 2019 International Results in Mathematics and Science, https://www.iea.nl/sites/default/files/2020-12/TIMSS%202019-International-Results-in-Mathematics-and-Science.pdf (accessed on 22 February 2023).
[49] National Center for Education Statistics (n.d.), Distance Education in IPEDS, https://nces.ed.gov/ipeds/use-the-data/distance-education-in-ipeds.
[15] National Forum for the Enhancement of Teaching and Learning (2016), Ireland’s Higher Education Technical Infrastructure: A Review of Current Context, with Implications for Teaching and Learning Enhancement – National Resource Hub, https://hub.teachingandlearning.ie/resource/irelands-higher-education-technical-infrastructure-a-review-of-current-context-with-implications-for-teaching-and-learning-enhancement/ (accessed on 8 August 2022).
[48] NCES (2021), Distance Education in IPEDS, https://nces.ed.gov/ipeds/use-the-data/distance-education-in-ipeds (accessed on 3 January 2023).
[96] NWO (2023), National Growth Fund | NWO, https://www.nwo.nl/en/researchprogrammes/national-growth-fund (accessed on 3 April 2023).
[39] OECD (2023), Advancing Digital Maturity in Croatia’s Higher Education System, OECD Publishing, Paris, https://doi.org/10.1787/26169177.
[4] OECD (2022), Internet access (indicator), https://doi.org/10.1787/69c2b997-en (accessed on 1 August 2022).
[11] OECD (2022), Mending the Education Divide: Getting Strong Teachers to the Schools That Need Them Most, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/92b75874-en.
[92] OECD (2022), PISA, https://www.oecd.org/pisa/ (accessed on 30 September 2022).
[93] OECD (2022), TALIS - The OECD Teaching and Learning International Survey - OECD, https://www.oecd.org/education/talis/ (accessed on 30 September 2022).
[3] OECD (2022), The OECD Going Digital Measurement Roadmap, OECD Digital Economy Papers, https://www.oecd-ilibrary.org/science-and-technology/the-oecd-going-digital-measurement-roadmap_bd10100f-en (accessed on 1 August 2022).
[28] OECD (2022), “What makes students’ access to digital learning more equitable?”, Teaching in Focus, No. 43, OECD Publishing, Paris, https://doi.org/10.1787/e8107345-en.
[40] OECD (2021), Design and implementation of a comprehensive monitoring system in Austria, https://www.oecd.org/fr/education/design-and-implementation-of-a-comprehensive-monitoring-system-in-austria-956e3d9d-en.htm (accessed on 10 August 2022).
[21] OECD (2021), How to measure innovation in education?, https://www.oecd.org/education/ceri/How-to-measure-innovation-in-education.pdf (accessed on 12 August 2022).
[19] OECD (2021), OECD Digital Education Outlook 2021: Pushing the Frontiers with Artificial Intelligence, Blockchain and Robots, OECD Publishing, Paris, https://doi.org/10.1787/589b283f-en.
[27] OECD (2021), “Supporting teachers’ use of ICT in upper secondary classrooms during and after the COVID-19 pandemic”, Teaching in Focus, No. 41, OECD Publishing, Paris, https://doi.org/10.1787/5e5494ac-en.
[2] OECD (2021), Supporting the Digital Transformation of Higher Education in Hungary, Higher Education, OECD Publishing, Paris, https://doi.org/10.1787/d30ab43f-en.
[23] OECD (2021), The State of School Education: One Year into the COVID Pandemic, OECD Publishing, Paris, https://www.oecd-ilibrary.org/docserver/201dde84-en.pdf (accessed on 30 September 2021).
[33] OECD (2020), A roadmap toward a common framework for measuring the digital economy: Report for the G20 Digital Economy Task Force, OECD Publishing, Paris, https://www.oecd.org/digital/ieconomy/roadmap-toward-a-common-framework-for-measuring-the-digital-economy.pdf (accessed on 22 January 2021).
[41] OECD (2020), Building Capacity for Evidence-Informed Policy-Making: Lessons from Country Experiences, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/86331250-en.
[65] OECD (2020), Digital Government Index: 2019 results, https://www.oecd.org/gov/digital-government-index-4de9f5bb-en.htm (accessed on 29 March 2022).
[43] OECD (2020), Strengthening the Governance of Skills Systems: Lessons from Six OECD Countries, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/3a4bb6ea-en.
[13] OECD (2020), TALIS 2018 Results (Volume II): Teachers and School Leaders as Valued Professionals, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/19cf08df-en.
[26] OECD (2020), “Teachers’ training and use of information and communications technology in the face of the COVID-19 crisis”, Teaching in Focus, No. 35, OECD Publishing, Paris, https://doi.org/10.1787/696e0661-en.
[67] OECD (2019), Benchmarking Higher Education System Performance, Higher Education, OECD Publishing, Paris, https://doi.org/10.1787/be5514d7-en.
[1] OECD (2019), “Building a monitoring and evaluation framework for open government”, in Open Government in Biscay, OECD Publishing, Paris, https://doi.org/10.1787/a70e8be3-en.
[90] OECD (2019), Measuring the Digital Transformation: A Roadmap for the Future, OECD Publishing, Paris, https://doi.org/10.1787/9789264311992-en.
[22] OECD (2019), OECD Skills Outlook 2019: Thriving in a Digital World, OECD Publishing, Paris, https://doi.org/10.1787/df80bc12-en.
[12] OECD (2019), TALIS 2018 Results (Volume I): Teachers and School Leaders as Lifelong Learners, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/1d0bc92a-en.
[63] OECD (2018), Education Policy Outlook 2018: Putting Student Learning at the Centre, OECD Publishing, Paris, https://doi.org/10.1787/9789264301528-en.
[68] OECD (2015), Education at a Glance 2015: OECD Indicators, OECD Publishing, Paris, https://doi.org/10.1787/eag-2015-en.
[25] OECD (2015), “Teaching with technology”, Teaching in Focus, No. 12, OECD Publishing, Paris, https://doi.org/10.1787/5jrxnhpp6p8v-en.
[32] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://doi.org/10.1787/9789264190658-en (accessed on 26 September 2022).
[66] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://doi.org/10.1787/9789264190658-en.
[24] OECD (2009), Creating Effective Teaching and Learning Environments: First Results from TALIS, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/9789264068780-en.
[64] OECD et al. (2020), SME Policy Index: Eastern Partner Countries 2020: Assessing the Implementation of the Small Business Act for Europe, SME Policy Index, OECD Publishing, Paris/European Union, Brussels, https://doi.org/10.1787/8b45614b-en.
[45] Portugal, G. (2020), Portugal Digital: Moving Forward. Moving with a Purpose - Portugal’s Action Plan for Digital Transition.
[57] Portulans Institute (2021), The Network Readiness Index 2021, https://networkreadinessindex.org/country/croatia/.
[97] Rathenau Instituut (2022), “Naar hoogwaardig digitaal onderwijs” [Towards high-quality digital education].
[30] Redecker, C. et al. (2017), “Digital Education Policies in Europe and Beyond: Key Design Principles for More Effective Policies”, https://doi.org/10.2760/462941.
[47] República Portuguesa (n.d.), Capacitação Digital das Escolas, https://digital.dge.mec.pt/ (accessed on 26 September 2022).
[86] Romero, C. and S. Ventura (2013), “Data mining in education”, Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, Vol. 3/1, pp. 12-27, https://doi.org/10.1002/WIDM.1075.
[79] Rycroft-Smith, L. (2022), “Knowledge brokering to bridge the research-practice gap in education: Where are we now?”, Review of Education, Vol. 10/1, p. e3341, https://doi.org/10.1002/REV3.3341.
[84] SBU (n.d.), About SBU, https://www.sbu.se/en/about-sbu/ (accessed on 12 August 2022).
[70] Scottish Government (2016), Enhancing Learning and Teaching Through the Use of Digital Technology: A Digital Learning and Teaching Strategy for Scotland, https://www.gov.scot/publications/enhancing-learning-teaching-through-use-digital-technology/documents/ (accessed on 12 August 2022).
[88] Slade, S. and P. Prinsloo (2013), “Learning Analytics: Ethical Issues and Dilemmas”, American Behavioral Scientist, https://doi.org/10.1177/0002764213479366.
[73] Staring, F. et al. (2022), Digital higher education: Emerging quality standards, practices and supports, https://www.oecd.org/education/digital-higher-education-f622f257-en.htm.
[14] The Campus Computing Project (2019), The 2019 Campus Computing Survey, https://www.campuscomputing.net/content/2019/10/15/the-2019-campus-computing-survey (accessed on 8 August 2022).
[78] Torres, J. and M. Steponavičius (2022), “More than just a go-between: The role of intermediaries in knowledge mobilisation”, OECD Education Working Papers, No. 285, OECD Publishing, Paris (accessed on 1 June 2023).
[56] Trucano, M. (2019), The case for a new Global Edtech Readiness Index, https://blogs.worldbank.org/edutech/new-global-edtech-readiness-index (accessed on 12 August 2022).
[5] UEN and Connected Nation (2022), 2021 Utah School Technology Inventory Report, Utah Education and Telehealth Network, Salt Lake City, https://www.uen.org/digital-learning/downloads/2021/21_UETN_Technology_Report.pdf (accessed on 12 August 2022).
[31] UNESCO (2022), “Guidelines for ICT in education policies and masterplans”, https://unesdoc.unesco.org/ark:/48223/pf0000380926 (accessed on 26 September 2022).
[29] University of Kent, SWGfL and Bitdefender (2022), “Cyber Security in UK Schools: How well protected are schools and colleges in England and Wales?”, https://swgfl.org.uk/assets/documents/cyber-security-report.pdf (accessed on 15 December 2022).
[72] Vaiouli, E. (2021), Assuring Quality of E-learning in VET - VET4EU2 Hub, https://hub.vet4eu2.eu/blog/2021/04/27/assuring-quality-of-e-learning-in-vet/ (accessed on 22 August 2022).
[60] Venegas Marin, S. et al. (2021), Where is EdTech working? Leveraging data for better EdTech policies, https://blogs.worldbank.org/education/where-edtech-working-leveraging-data-better-edtech-policies (accessed on 21 August 2022).
Annex 9.A. International surveys/assessments with information on digital education issues
The most relevant international surveys/assessments touching on digital education issues include:
The European Survey of Schools: ICT in Education (known as ESSIE)
This survey was commissioned by the European Commission with the aim of benchmarking progress in ICT in schools, i.e. providing detailed and up-to-date information on access to, use of and attitudes towards technology in education (European Commission, 2019[8]; European Commission, 2013[9]).
ESSIE was first administered in 2011-12, and a second round followed in 2017-18, covering all EU member states as well as Iceland, Norway and Türkiye. The survey was carried out by a consortium of Deloitte and Ipsos MORI.
ESSIE focuses on the primary, lower secondary and upper secondary levels of education. It consists of an online survey and interviews with head teachers, class teachers (one teacher at ISCED level 1 [International Standard Classification of Education], three teachers at ISCED levels 2 and 3), students (all students from one randomly selected class per level in each school, except at ISCED level 1) and parents.
The International Computer and Information Literacy Study (ICILS)
This large-scale assessment was initiated by the IEA with the aim of assessing core aspects of students’ digital literacy, with a focus on computer literacy, information literacy and computational thinking. The study also aims to ascertain student preparedness for study, work and life in a digital world, and addresses some aspects of digital citizenship (Fraillon et al., 2019[10]).
ICILS has been administered every five years since 2013. The second round took place in 2018, and the third will take place in 2023.
The 2018 round of ICILS covered 13 countries and economies (including 7 EU countries), while the next round will cover 33 countries and economies (including 21 EU countries). Its target population is students at Grade 8 (average age: 13.5), and the study consists of an online survey administered to teachers and principals, alongside a national context questionnaire.
The distinct advantage of ICILS as a monitoring tool for the enabling factors of digital education and skills is that it directly assesses learners’ digital skills, providing a critical outcome measure to gauge the progress and success of digital strategies and action plans.
The OECD Teaching and Learning International Survey (TALIS)
This survey was commissioned by the OECD with the aim of providing robust international indicators and policy-relevant analysis on teachers, their principals and the schools they work in, in a timely and cost-effective manner (OECD, 2022[11]; OECD, 2019[12]; OECD, 2020[13]).
TALIS was first administered in 2008. Subsequent rounds were administered in 2013 and 2018, and the fourth round will take place in 2024. For the first time, it will include an optional direct assessment of teachers’ pedagogical knowledge, with an emphasis on the use of digital resources and tools for teaching, and will hence provide a first attempt at monitoring teachers’ capacity for digital education.
Survey development and implementation are carried out by a consortium led by the IEA.
TALIS covers 55 countries. It focuses on early childhood education and care centres as well as the primary, lower secondary and upper secondary levels of education. It consists of an online survey of teachers and school principals.
The Progress in International Reading Literacy Study (PIRLS)
This large-scale assessment was initiated by the IEA with the aim of monitoring system-level trends in student reading achievement at Grade 4 in a global context, and of evaluating how well students read, interpret and critique online information in an environment that looks and feels like the internet (ePIRLS). It also examines the use of information technology in the classroom to better understand the classroom context.
PIRLS has been administered every five years since 2001. Subsequent rounds were administered in 2006, 2011 and 2016, and the fifth round took place in 2021. Its results will be launched in May 2023.
The 2021 PIRLS round covers 27 countries and 5 benchmarking entities (including 21 EU countries). Its target population is students at Grade 4, and the study consists of an online survey administered to principals, teachers, students and parents, alongside a national curriculum questionnaire.
The Trends in International Mathematics and Science Study (TIMSS)
This large-scale assessment was initiated by the IEA with the aim to monitor system-level trends in student achievement in mathematics and science at Grades 4 and 8 in a global context (Mullis et al., 2020[95]).
TIMSS has been administered every four years since 1995. Subsequent rounds were administered in 1999, 2003, 2007, 2011, 2015 and 2019, and the eighth round will take place in 2023.
TIMSS covers 64 countries and 8 benchmarking entities. Its target population is students at Grade 4 and/or Grade 8, and the study consists of an online (or paper-based) survey administered to principals, teachers, students and parents, alongside a national curriculum questionnaire.
TIMSS contains only limited information on digital technologies in education, namely indicators on access to computers during mathematics and science lessons, teachers’ use of computers during mathematics and science lessons, and students’ use of computers to take mathematics and science tests.
The Programme for International Student Assessment (PISA)
This large-scale assessment was commissioned by the OECD with the aim of assessing the extent to which 15-year-old students have acquired the key knowledge and skills essential for full participation in society.
PISA was first administered in 2000. Subsequent rounds were administered in 2003, 2006, 2009, 2012, 2015, 2018 and, most recently, in 2022. The results of the latest round will be published in December 2023. PISA covers 84 participating countries and economies.
More details on the technical parameters of these surveys/assessments are available from the International Large-Scale Assessment (ILSA) Gateway (https://www.ilsa-gateway.org/).
Annex 9.B. National monitoring and evaluation of high-level education digitalisation strategies by country
Annex Table 9.B.1. Monitoring and evaluation provisions of high-level education digitalisation strategies by country
Country/economy¹ | Monitoring and evaluation provisions for digital education
---|---
Austriaᶜ | The initiatives under the Austrian 8-Point Plan Digital School are monitored through professional project management and project control in the education ministry. Various key indicators are also used, such as the number of digital devices issued to pupils. Monitoring is carried out on an ongoing basis. In addition, the use of digital devices will be evaluated in the 2022/23 school year. The focus of the evaluation will be on identifying scenarios for how schools use the devices in the classroom (in the various subjects, etc.). The evaluation will not be a full census but will be based on a representative sample.
Belgium FL | The Flemish Community of Belgium has developed the Monitor for ICT Integration in Flemish Education (MICTIVO), a survey administered to pupils, teachers and school managers every five years. The survey covers a sample of around 20% of Flemish schools and evaluates the infrastructure and policy, perceptions, competencies and integration of ICT in primary and secondary education as well as in adult basic education (Heymans et al., 2018[7]).
Belgium FRᶜ | While no detailed concept for monitoring and evaluation has been released yet, the education digitalisation strategy of the French Community of Belgium foresees the implementation of a monitoring tool to track the roll-out of the digital transition in schools, with a particular focus on IT infrastructure.
Belgium GEᶜ | No provisions on monitoring and evaluation.
Bulgariaᶜ | Bulgaria currently features a range of broader strategies that touch on digital education, such as the National Programme Digital Bulgaria 2025 or the Digital Transformation of Bulgaria for the Period 2020-2030. These strategies foresee the regular release of interim reports on the progress of their implementation.
Croatiaᶜ | Monitoring and evaluation are conducted as part of the Croatian e-Schools project, which is currently being rolled out in primary and secondary schools. Monitoring and evaluation are based on Croatia’s strategic framework for the digital maturity of schools.
Cyprus | n/a
Czech Republicᶜ | n/a
Denmarkᶜ | In 2022, the government released a new broad digital strategy which also includes its ambitions for the digitalisation of the education sector. The implementation of the strategy will be supervised by a digitalisation council composed of experts and representatives of the public and private sectors.
Estoniaᶜ | Estonia’s education digitalisation strategy is implemented through three-year programmes and is monitored annually on the basis of a set of indicators. In addition, Estonia has piloted a low-stakes test of students’ digital competence as part of quality assurance procedures. Previous evidence showed that the country also relied on schools’ self-reporting on their digital technology infrastructure, surveys of students, teachers and parents in Estonian schools, and an annual report developed by a specialised agency. More generally, the strength of the Estonian monitoring and evaluation system lies in the Estonian Education Information System (EHIS), a comprehensive online information system that brings together data on schools, pupils, teachers, exams and qualifications.
Finlandᶜ | While there are no explicit monitoring and evaluation provisions in place, Finland has published a detailed description of the digital and programming competencies foreseen for each age bracket as part of its New Literacies programme. These descriptions are intended to inform education providers as they update their own plans and benchmark their students against the competence areas.
France | The French digital strategy for education sets out a commitment to collect data on the use of digital technologies, training and equipment availability from actors in the digital education environment, and to establish a shared indicator dashboard published by the Ministry of Education. Furthermore, the ministry will carry out regular evaluations in co-operation with the Department of Evaluation, Forecasting and Performance. With respect to digital skills, France has developed a reference framework for digital skills (CRCN) organised into 5 domains and 16 skills. Students are tested on these skills through the Pix Certification of Digital Skills, administered by an external provider.
Germanyᶜ | While there is no monitoring and evaluation of school digitalisation at the national level, the federal state governments are responsible for reporting on the progress of Germany’s flagship school digitalisation initiative ‘DigitalPakt Schule’.
Greece | Some aspects of school digitalisation are monitored through the national information system ‘Myschool’, which is run by the Ministry of Education. School principals regularly update information on schools’ human resources and equipment in the system so that schools can be supported accordingly.
Hungaryᶜ | Hungary’s digital education strategy proposes the development of a measurement, evaluation and reporting system that can serve as the basis for policy decisions. It also suggests the creation of a Digital Methodology Centre tasked with tracking the achievement of the strategy’s goals.
Ireland | The implementation of Ireland’s digitalisation strategy is supervised by a central steering group. The objectives of the strategy are further supported by an implementation plan running from 2022 to 2024. A mid-term review will be carried out in 2025 to inform the next implementation plan.
Italyᶜ | The two Italian digitalisation projects ‘Next Generation Classrooms’ and ‘Next Generation Labs’, set out in the Italian school digitalisation strategy, entail monitoring of schools in six-monthly cycles. Implementing schools have to upload information on their progress through an online monitoring tool.
Latviaᶜ | Both Latvian guidelines set out the monitoring of digital skill levels in the population. In addition, regular evaluations of implementation progress, including an interim report on the education development guidelines, are planned.
Lithuaniaᶜ | The Lithuanian progress instrument ‘Digital Transformation of Education’ elaborates in detail the targets of Lithuania’s education digitalisation, building on the general educational development plan. The document is accompanied by a list of indicators that provide for regular monitoring of the progress of school digitalisation.
Luxembourg | n/a
Malta | n/a
Netherlands | While there is no system-wide strategy in place, a range of actors are involved in monitoring and evaluating different aspects of digital education in the Netherlands. For instance, the Dutch Ministry of Education is responsible for monitoring and evaluating several long-term digital education programmes that are part of the National Growth Fund, a public fund that invests in projects to ensure long-term economic growth (NWO, 2023[96]). In addition, several non-governmental organisations, such as the Rathenau Instituut, a technology assessment organisation, have published evidence on the state of digitalisation in the Dutch education system (Rathenau Instituut, 2022[97]).
Poland | Poland is currently developing a new strategy for digital competences. Once in force, the implementation of the strategy will be monitored by the Digital Competence Development Centre.
Portugalᶜ | Portugal launched an online platform at the end of 2021 to facilitate the gathering of information on the progress of digitalisation in schools. Through this platform, digital ambassadors submit data on key indicators regarding the implementation of digitalisation policies at their schools.
Romaniaᶜ | No provisions on monitoring and evaluation.
Slovak Republicᶜ | The policies foreseen by the Strategy of the Digital Transformation of Slovakia are specified in more detail in the corresponding action plan. The implementation of this action plan is subject to annual reviews submitted to the government of the Slovak Republic.
Slovenia | n/a
Spainᶜ | In Spain, monitoring and evaluation provisions for education digitalisation are captured in the co-operation plans #EcoDigEdu and #CompDigEdu. While the provisions set out key indicators on which data should be collected, the task of collecting the data lies with the autonomous communities.
Swedenᶜ | The National Agency for Education conducts follow-up studies on the implementation of the Swedish digitalisation strategy and the achievement of its goals every three years. The most recent report, released in 2022, was based on a survey of teachers and school heads and focused on the digital competences of all members of the school community, equal access to and use of digital resources, and the general potential for digitalisation in schools.
Note:
¹ As part of the data gathering process, the OECD reached out to national officials in the Eurydice country units of all EU member states and conducted background research on their monitoring and evaluation provisions. Superscript ‘C’ in the country column indicates that the information displayed was obtained from national officials.
Source: Author’s elaboration.