Empowering Young Children in the Digital Age

8. Data and monitoring in early childhood education and care in the digital age

Abstract

Digitalisation brings new opportunities and demands for quality monitoring in early childhood education and care (ECEC). This chapter discusses challenges for establishing robust data management and quality monitoring systems at a time when data are increasingly available and digital technologies are increasingly present in ECEC settings. Building on responses to the ECEC in a Digital World policy survey (2022), the chapter examines the availability of ECEC data systems across countries and their most common goals and features. It then looks at the inclusion of digitalisation-related elements in ECEC quality monitoring frameworks.

Key findings
The expansion of evidence plays a key role in informing ECEC policy and practice. Digital technologies bring opportunities to set up robust data infrastructures in the ECEC sector, with the potential to support policy design and evaluation. In turn, there are demands on quality monitoring frameworks to adapt to the gradual integration of digital technologies in a variety of processes in ECEC settings.
A large majority of the countries and jurisdictions participating in the ECEC in a Digital World policy survey (2022) have a data system in place that maintains longitudinal information about their ECEC sector and facilitates analysis and reporting for the ECEC authorities. The breadth of these data systems’ coverage tends to reflect the governance of the sector within each country or jurisdiction. Data sharing and integration are more challenging when responsibilities for different services or age groups are shared across multiple government agencies and/or service providers.
Supporting evaluation, accountability and management processes is a “high”-priority function for all the ECEC data systems reported on in the policy survey. Most often, data systems inform these processes by aggregating information at the country or jurisdiction level, but systems are also widely used to support monitoring and management at the setting level. Enabling research is another commonly reported purpose of ECEC data systems.
The features and granularity of the information maintained by ECEC data systems vary across countries and jurisdictions. Unique identifiers for ECEC settings are available in almost all data systems, whereas personal identifiers for children and staff members are available in more than two-thirds and about half of the systems, respectively. Demographic information on individual children and staff as well as staff’s qualifications and experience records are also maintained by a majority of systems. The capacity to link setting-level to child-level data within their ECEC data system is reported by around half of the countries and jurisdictions, while fewer report linkages between setting-level and staff-level data.
Less than half of the participating countries and jurisdictions currently evaluate aspects related to the use of digital technologies in ECEC settings as part of their quality monitoring frameworks. The most commonly monitored aspects are ECEC professionals’ competencies for integrating digital tools into their pedagogical work with children and into administrative and collaboration work processes. This suggests room for further aligning ECEC quality monitoring frameworks with ongoing responses to digitalisation in curriculum and pedagogy and in workforce preparation programmes.
Introduction
Data has emerged as a strategic asset for improving policy making and public service delivery across sectors, including education. However, the risks of data misuse have also grown, and consistent data governance frameworks are needed to maximise the benefits of data while addressing the related risks, both of which stem largely from increased data openness (OECD, 2022[1]).
Data and monitoring are powerful levers to promote quality and support evidence-based policy making in ECEC. Within the Starting Strong framework, data is understood as the collection of strategic information on ECEC, while monitoring refers to the ongoing evaluation of ECEC services by systematically tracking a variety of aspects related to quality (OECD, 2012[2]; 2015[3]). Policy-oriented analysis building on the Starting Strong VI review identifies optimising the use of data and strengthening the focus of monitoring on process aspects as two major policy pointers for advancing quality assurance and improvement in the ECEC sector (OECD, 2022[4]).
Digitalisation brings new opportunities and demands regarding data use and quality monitoring in ECEC. A wealth of data is routinely collected in the ECEC sector, from demographics about children and their families to enrolment records for different services and programmes and assessments of the quality of provision and of children’s developmental pathways. Data collection on structural quality standards (e.g. group size) is an established practice, and information on the profiles and conditions of the ECEC workforce (e.g. qualifications, turnover) is becoming increasingly available. However, in many countries, the lack of framework policies for data collection and management has resulted in a fragmented data architecture in ECEC systems, with multiple data silos and limited interoperability between the tools used to access and analyse these data. This fragmentation restricts opportunities for obtaining the comprehensive and in-depth view of the ECEC sector that could arise from combining complementary data sets covering its different aspects. Recent improvements in digital infrastructure have nonetheless greatly enhanced ECEC systems’ capacity to efficiently collect and link data about different settings and programmes. At the same time, new privacy protection regulations are being introduced that set limits on the collection and processing of personal data of young children and ECEC professionals. Hence, a major challenge for countries is strengthening their data systems to support monitoring and improvement in ECEC without compromising on the need to protect privacy.
In addition, as many ECEC systems review their curriculum and pedagogy frameworks and their workforce preparation programmes in light of digitalisation, new demands emerge for quality monitoring, for instance regarding the digital competencies of ECEC staff or the quality of the interactions that young children may have with digital tools in ECEC settings.
Exploring strategies to activate the data and monitoring policy lever, this chapter first discusses the benefits of robust ECEC data and quality monitoring systems, as well as some of the policy challenges in establishing them. Second, it explores the availability of comprehensive ECEC data systems across the countries and jurisdictions that participated in the ECEC in a Digital World policy survey (2022), the purposes for which these data systems are most often used and their most common features. Third, it looks at the inclusion of digitalisation-related elements in ECEC quality monitoring frameworks. It concludes with policy pointers for strengthening data management and quality monitoring in ECEC in the digital age.
Robust early childhood education and care data and monitoring systems: Benefits and policy challenges
Research developments and social changes over recent decades have elevated ECEC in policy agendas, resulting in growing levels of enrolment and increasing recognition of the value of high-quality ECEC in supporting young children’s learning, development and well-being (OECD, 2021[5]). Parallel to these developments is the expansion of evidence about ECEC programmes and its growing role in informing policy and practice. For instance, indicators on ECEC structural and process quality dimensions can contribute to increased knowledge about the quality of provision, while information on the demographic and background characteristics of children in ECEC can be used to determine programme effects on target groups. Often gathered through monitoring systems, these data are important for gaining a solid understanding of the workings and performance of ECEC systems, which is essential not only for accountability purposes, but also for policy design and implementation, and to inform families about the quality of ECEC services. Most importantly, monitoring is key to assessing whether and how ECEC supports children’s development and well-being and what can be done to improve its quality and equity (OECD, 2018[6]).
ECEC monitoring systems and the indicators they produce vary notably across countries, reflecting the wide variety of configurations of ECEC settings and types of provision internationally. Nonetheless, past Starting Strong reviews have identified common trends in ECEC quality monitoring policies and practices, including the increasing intensity of monitoring practices; improvements in monitoring methodologies and processes; the integration of monitoring areas; alignment with primary school monitoring systems; and the increasing availability of monitoring results for the general public (OECD, 2015[3]). Common to these trends are enhanced efforts to collect and integrate an expanding range of data elements about ECEC services, and to derive relevant indicators about quality.
However, without a clear understanding of why data are needed, data collections may simply respond to compliance requirements, rather than being guided by the potential of adequate indicators to help improve services. These indicators need to be determined in accordance with countries’ ECEC quality and equity frameworks and their specific institutional and socio-cultural contexts. Therefore, the scope of data collection needs to reflect the purposes of monitoring. An important effort in this direction is to establish a robust data infrastructure that aligns with ECEC quality and equity monitoring frameworks agreed upon at the national/jurisdiction level (OECD, 2012[2]). Data systems, also known as information systems, are a particular type of general-purpose technology that facilitates data collection, storage and use. In the ECEC sector, data systems typically maintain and link a range of setting- and individual-level data elements collected at different points in time, thus potentially enabling longitudinal analysis of these data. This can include multiple types of information about children, from their socio-demographic backgrounds to their participation in ECEC, as well as information about ECEC staff. Generally, data systems are also designed to facilitate data access and data use through a range of reporting and analysis tools (Data Quality Campaign, 2017[7]).
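To make this architecture concrete, the sketch below illustrates, in simplified form, how a longitudinal ECEC data system can link setting- and individual-level records over time through unique identifiers. The tables, fields and identifiers shown are hypothetical assumptions for illustration and do not correspond to any particular country’s system.

```python
# A minimal sketch of an ECEC data system linking setting- and individual-level
# records over time. All names and values are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE settings (
    setting_id    TEXT PRIMARY KEY,  -- unique, permanent setting identifier
    provider_type TEXT,              -- e.g. public, private
    municipality  TEXT
);
CREATE TABLE children (
    child_id      TEXT PRIMARY KEY,  -- unique, permanent child identifier
    birth_year    INTEGER,
    home_language TEXT
);
CREATE TABLE enrolments (
    child_id       TEXT REFERENCES children(child_id),
    setting_id     TEXT REFERENCES settings(setting_id),
    school_year    TEXT,             -- repeated observations enable longitudinal analysis
    hours_per_week REAL
);
""")

# Illustrative records (entirely fictitious).
conn.execute("INSERT INTO settings VALUES ('SET-01', 'public', 'Example City')")
conn.execute("INSERT INTO children VALUES ('CHILD-0001', 2018, 'other')")
conn.executemany(
    "INSERT INTO enrolments VALUES (?, ?, ?, ?)",
    [("CHILD-0001", "SET-01", "2021/22", 25.0),
     ("CHILD-0001", "SET-01", "2022/23", 30.0)],
)

# Longitudinal query: one child's enrolment hours across years, linked to
# characteristics of the setting attended.
rows = conn.execute("""
    SELECT e.school_year, e.hours_per_week, s.provider_type
    FROM enrolments e
    JOIN settings s ON e.setting_id = s.setting_id
    WHERE e.child_id = ?
    ORDER BY e.school_year
""", ("CHILD-0001",)).fetchall()
print(rows)
```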
A first policy challenge towards achieving this goal relates to the fragmented data architecture that arises from the co-existence of diverse ECEC programmes and governance structures. To provide a holistic understanding of the ECEC system for policy makers, providers and other stakeholders, ECEC data systems must have the capacity to collect and link data on children, programme characteristics and workforce across multiple programmes and bodies with different responsibilities. For instance, in countries with “dual” or “split” systems where different authorities are in charge of childcare and early education, as well as in countries with decentralised monitoring and accountability procedures, data may not always enable country-wide comparisons on shared measures of high-quality ECEC that apply to all settings and children. This may happen when data collection is not sufficiently harmonised, but also when data are not shared and integrated despite adequate standardisation. A split in responsibilities for different aspects of the quality assurance process in the ECEC sector is common internationally, and agencies in charge of different monitoring arrangements often report to different departments or ministries within government. Further, in many countries and jurisdictions, a large number of small ECEC providers operate with limited resources, some of which may have difficulties coping with the demands of quality systems, including those related to data collection and processing (OECD, 2022[4]). Therefore, setting up a robust ECEC data system can be particularly challenging, but also particularly beneficial, in countries with a greater variety of ECEC programmes and governance structures.
A second challenge for enhancing the use of evidence relates to the multifaceted nature of quality in ECEC and, more tangentially, to the dearth of research about the impact that digital technologies can have on the quality of interactions in ECEC settings. Monitoring the quality of ECEC services and measuring their effectiveness at a system level is challenging (OECD, 2015[3]). Among the many requirements is the capacity to implement a strategic collection of data that maintains high standards of reliability over time and across multiple providers and programmes, and that is based on a solid understanding of the defining components of quality and on adequate measurement methodologies. To support these efforts, ECEC data systems need to integrate accurate and comprehensive inputs that relate to both structural and process quality, as well as rich contextual information, all of which can be combined to support robust analysis on quality and effectiveness. A more specific challenge concerns the collection of information about the extent and types of uses of digital technologies in ECEC settings, provided that these become approved practices. Many countries and jurisdictions have begun to adapt their ECEC curriculum and pedagogy frameworks (see Chapter 4) and their ECEC workforce development strategies (see Chapter 5) to respond to digitalisation trends, but the evidence base on the impact that digital technologies can have on the quality of ECEC is still very limited. As a result, many open questions remain with regards to the type of information that quality monitoring frameworks would need to collect about digitalisation-related aspects.
There can be many benefits to setting up robust ECEC data systems to support different aspects of broader quality systems. A major potential contribution relates to supporting accountability and improvement processes. Data systems can be instrumental in meeting demands for public accountability in the ECEC sector while also generating information on the strengths and weaknesses of specific services and of the sector as a whole. Systematic data collection and reporting can give users of ECEC services access to valuable information to help them make choices between different providers, a particularly relevant function in a sector that, in many countries, heavily relies on private providers in combination with state-run provision. A system that maintains comprehensive and reliable ECEC data is important to assist inspectorates and inform evaluations that support quality assurance. In addition, ensuring that providers also have access to a coherent package of quality indicators can be a starting point for promoting self-evaluation (OECD, 2022[4]). A recent study identified the effective use of data as a common feature of the ECEC systems of Australia, Hong Kong (China), England, Finland, Korea and Singapore. All these systems have developed a data infrastructure to systematically gather and mobilise ECEC data, using it to understand strengths and areas for improvement in their ECEC provision, generate evidence to evaluate policy impact, and inform changes in their strategies. All also face common data challenges, including confidentiality, consistency and fidelity of instruments, as well as timely and effective data integration and use (Kagan et al., 2019[8]).
Another potential benefit of data systems is to strengthen the infrastructure for research on ECEC. Central to the research value of the administrative or large-scale data sets that ECEC data systems typically maintain is that they make it possible to use methodologies that approximate experimental research designs and facilitate causal inferences, with a strong potential to inform policy analysis and evaluation (Murnane and Willett, 2010[9]). This possibility stems from three critical features of large-scale data sets: 1) covering the entire or a very substantial share of the population of interest (large “n”), which leads to gains in statistical power and opportunities to study “rare” populations; 2) including a wide range of variables (large “k”), which allows exploring a wide range of inputs, outputs and correlates of ECEC; and 3) providing repeated, individual-level observations (large “t”), which improves opportunities to assess change over time (Saw and Schneider, 2016[10]). The use of administrative data is a growing trend in educational research, and multiple examples exist of studies drawing on such data sets to look at the effects of ECEC experiences on various life outcomes (Figlio, Karbownik and Salvanes, 2016[11]).
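As a simple illustration of the third feature (large “t”), the sketch below shows how repeated individual-level observations support basic change-over-time analyses. The data, variable names and outcome measure are invented for illustration only.

```python
# Illustrative only: repeated individual-level observations allow simple
# within-child change-over-time analyses. All data are fictitious.
import pandas as pd

obs = pd.DataFrame({
    "child_id": ["A", "A", "B", "B", "C", "C"],
    "year": [2021, 2022, 2021, 2022, 2021, 2022],
    "vocabulary_score": [48, 61, 52, 59, 45, 58],   # hypothetical outcome measure
})

# Reshape to one row per child and compute within-child change between years.
wide = obs.pivot(index="child_id", columns="year", values="vocabulary_score")
wide["change"] = wide[2022] - wide[2021]
print(wide)
```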
The research and policy potential of ECEC data systems can be further enhanced through their integration with data about other levels of education or other sectors. In the United States, state- and local‑level integrated data systems are supporting policy design and evaluation in various sectors, including education, health and social services (Fantuzzo and Culhane, 2015[12]). These integrated data systems combine data from multiple government agencies, are designed to serve a general purpose rather than specific research projects and link individual-level data. This type of data infrastructure can engage stakeholders across sectors and administrative silos and facilitate the analysis of outcomes for large populations, attending to a broader range of factors than would be possible using ECEC or education data alone. As an example of application to ECEC research and policy, an integrated data system was used in the city of Philadelphia (United States) to identify neighbourhoods with a greater share of children exposed to cumulative risks and a lower share of high-quality ECEC slots. The combination of health, education and human service data enabled the estimation of demand indicators based on multiple early risk experiences, as well as supply indicators based on actual counts of slots in preschool centres with a high-quality rating, in both cases improving the quality of previously available estimates. Policy makers used the findings to inform the planning and roll-out of the expansion of the city’s ECEC services (Fantuzzo et al., 2021[13]). The effective implementation of these integrated data systems is, however, complex and requires multiple supporting measures, including specific governance models and legal agreements for data sharing and privacy protection, adapted technology and security solutions, and common data standards (Culhane et al., 2017[14]).
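The sketch below gives a stylised sense of the kind of cross-sector linkage and aggregation described above: individual-level extracts from different (hypothetical) agencies are joined on a shared child identifier and then summarised at the neighbourhood level. All data sets, field names and thresholds are illustrative assumptions, not the actual Philadelphia methodology.

```python
# A stylised example of cross-sector record linkage and neighbourhood-level
# aggregation. All records and field names are fictitious.
import pandas as pd

ecec = pd.DataFrame({"child_id": [1, 2, 3, 4],
                     "neighbourhood": ["N1", "N1", "N2", "N2"],
                     "high_quality_slot": [True, False, False, False]})
health = pd.DataFrame({"child_id": [1, 2, 3, 4],
                       "low_birth_weight": [False, True, True, False]})
social = pd.DataFrame({"child_id": [1, 2, 3, 4],
                       "homelessness_episode": [False, True, False, True]})

# Link individual-level extracts on the shared child identifier.
linked = ecec.merge(health, on="child_id").merge(social, on="child_id")
linked["cumulative_risk"] = (linked[["low_birth_weight", "homelessness_episode"]]
                             .sum(axis=1) >= 2)                  # hypothetical threshold

# Neighbourhood-level indicators: share of children facing cumulative risks
# versus share holding a high-quality ECEC slot.
summary = linked.groupby("neighbourhood").agg(
    share_at_risk=("cumulative_risk", "mean"),
    share_high_quality=("high_quality_slot", "mean"),
)
print(summary)
```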
Besides the development of data systems, responses to digitalisation within the monitoring policy lever can also include adaptations in quality monitoring frameworks to align with changes in other policy levers, in particular those of curriculum and workforce development. For instance, developmental goals in relation to children’s early digital literacy or uses of digital tools in ECEC settings may begin to be targeted by quality monitoring systems following their introduction in curricular or pedagogical frameworks. Similarly, levels of digital competencies among ECEC staff and the quality of related training opportunities may be monitored if the development of these competencies becomes an expectation or requirement for ECEC professionals.
Responses to the ECEC in a Digital World policy survey (2022) indicate that improving the integration of ECEC data systems is a policy challenge considered of “very high” or “high” importance by more than half of participating countries and jurisdictions (Figure 8.1). Data integration can serve the purpose of information sharing and co-ordination with other sectors that also support young children and their families (e.g. health or social services), within the ECEC sector itself (across ECEC settings and programmes, including for children in different age groups), or with other levels of education (e.g. ISCED 1). Over 40% of countries and jurisdictions also identified the digitalisation of monitoring and assessment processes as a significant policy challenge, which suggests that the introduction of digital tools to support these processes is receiving substantial attention in ECEC systems.
Data systems in early childhood education and care
Robust data systems hold great potential to enhance quality monitoring and policy analysis in ECEC, but major challenges exist for developing such systems, as discussed in the previous section. This section explores the scope, purposes and features of current ECEC data systems across countries and jurisdictions.
Availability and scope of ECEC data systems
A large majority (79%) of the countries and jurisdictions participating in the ECEC in a Digital World policy survey (2022) report having a data system in place that maintains longitudinal records about their ECEC services and facilitates analysis and reporting for the relevant authorities. This includes 18 cases where the coverage extends to all types of ECEC settings within the country or jurisdiction and 13 cases where the coverage only applies to some types of ECEC settings (Figure 8.2).
ECEC data systems with universal or near-universal coverage of the sector are often found in countries with a strong infrastructure of population-wide administrative registers, such as the Nordic countries. In Finland, for instance, the Varda (Varhaiskasvatuksen tietovaranto) National Data Warehouse for ECEC launched in 2019 maintains nationwide information from all types of early childhood education operators, including municipalities, joint local authorities and private ECEC service providers, making it possible to automate data transfers between operators’ own data systems and Varda. The system was designed to eliminate the need for different national, regional and local authorities to maintain duplicate registers on ECEC, with expected efficiency gains in data collection and management. The Finnish National Agency for Education is responsible for the general operations of Varda and can combine its data with data in other national repositories for primary education, secondary education and tertiary levels of education. Similarly, in Norway, all registered ECEC centres (kindergartens) submit an annual electronic report to the national data system BASIL (BArnehage Statistikk Innrapporterings Løsning), a reporting platform which is the main source of official statistics about the Norwegian ECEC sector. BASIL is managed by the Norwegian Directorate for Education and Training, and Statistics Norway is responsible for linking the data from BASIL to other administrative data sets, for instance to calculate ECEC enrolment rates for different groups of children.
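Automated transfers between providers’ local systems and a national warehouse typically rely on agreed record formats and validation rules. The sketch below illustrates, using purely hypothetical field names and rules, what a minimal pre-submission check of an enrolment record might look like; it is not based on the actual Varda or BASIL specifications.

```python
# Purely illustrative: a minimal validation step an operator's system might run
# before transmitting an enrolment record to a national data warehouse.
# Field names and rules are hypothetical.
import json
from datetime import date

REQUIRED_FIELDS = {"setting_id", "child_id", "start_date", "hours_per_week"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors (an empty list means the record passes)."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if "start_date" in record:
        try:
            date.fromisoformat(record["start_date"])
        except ValueError:
            errors.append("start_date is not a valid ISO date")
    if record.get("hours_per_week", 0) <= 0:
        errors.append("hours_per_week must be positive")
    return errors

record = {"setting_id": "S-042", "child_id": "C-123",
          "start_date": "2022-08-15", "hours_per_week": 30}
print(validate_record(record))   # [] -> ready to transmit
print(json.dumps(record))        # serialised payload for transfer
```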
In other cases, ECEC data systems maintain information about specific types of settings or segments of the sector only, with responsibilities for data collection and management generally reflecting the governance models of the ECEC sector within countries and jurisdictions. For instance, in Ireland, a data system integrates information on all ECEC settings that receive public funding from the Department of Children, Equality, Disability, Integration and Youth, which represents the vast majority of settings within the country, but the few settings relying exclusively on private funding are not covered by the system. In Canada, it is common for ECEC data systems to maintain data on school-based settings for children ages 3-5, whereas data on settings for children ages 0-5 are only integrated in some provinces and territories. For instance, Alberta’s Child Care Information System maintains data on licensed day care, preschool and family day homes for children aged 0 to school entry, encompassing over 2 500 facilities accredited with the Alberta Child Care Licensing System. In British Columbia, the data system covers licensed childcare settings across the jurisdiction that participate in government funding programmes for children ages 0-3 or 3-5.
Responses to the ECEC in a Digital World policy survey (2022) from several countries illustrate the challenges that the fragmentation of responsibilities within the ECEC sector poses to the development of comprehensive data systems. This can be due to different authorities being in charge of services for children in different age groups. For instance, in Portugal, data about children under age 3 are collected and maintained by the Ministry of Labour, Solidarity and Social Security, whereas data about children aged 3-5 years attending pre-primary education are managed by the Ministry of Education. Challenges may also arise when data about different aspects of ECEC provision are managed by different actors. For instance, in the Czech Republic, statistical data about ECEC settings are collected by the Ministry of Education, whereas data on structural and procedural quality of preschool education are collected by the Czech School Inspectorate. In Slovenia, the Ministry of Education is responsible for collecting the majority of records in the country’s ECEC data system, both for public and private pre-primary centres, but financial information about ECEC settings is maintained separately by the Ministry of Finance, while determining families’ eligibility for subsidies of ECEC fees further requires using data maintained by the Ministry of Labour, Family and Social Affairs. Nonetheless, data-sharing arrangements between these different agencies compensate for the lack of a system integrating different types of data.
Purposes of ECEC data systems
When asked to identify the main purposes of their ECEC data systems, about two-thirds of countries and jurisdictions participating in the survey indicated that supporting evaluation, accountability and management processes at the country/jurisdiction level were “high” priority functions for their systems, with an additional number of respondents listing those same functions as a “moderate” priority (Figure 8.3). Supporting evaluation and accountability may involve the production of statistical indicators to measure progress in relation to stated objectives for ECEC services, whereas uses to support management can include analysing data to inform staffing or other resource allocation decisions. Data systems can support these processes at different levels, depending on how the data are aggregated and the types of analysis conducted with them. While a majority of countries and jurisdictions noted that supporting evaluation, accountability and management processes within ECEC settings more specifically is also a “high” priority for their data systems, the results suggest that a stronger emphasis is placed on mobilising data for whole-of-system policy analysis and evaluation, compared to using data to support decision making at the setting level.
However, nearly half of the participating countries and jurisdictions reported that helping staff and centre leaders improve responsiveness to individual child needs in ECEC settings is also a “high” priority for their data systems. Additionally, close to a third indicated that facilitating knowledge sharing and collaboration among ECEC settings and professionals is a “high” priority. The potential role of data systems as a research infrastructure is also visible in the responses to the survey, with nine in ten countries and jurisdictions listing enabling research as a “high” or “moderate” priority.
This ranking of potential goals of ECEC data systems may be seen as reflecting the evaluation and reporting approach that has traditionally guided the use of data in the education sector. Another way to interpret the ranking is by identifying the stakeholders (e.g. policy makers, settings, ECEC professionals and researchers) whose needs are served by different potential uses: this lens suggests that ECEC data systems most often remain a tool for policy makers and evaluators. However, responses to the survey suggest that an ambition to support the use of data with a greater potential to impact practices at the setting level is also present in many countries, including Australia, the Czech Republic, Hungary, Israel, Italy and Portugal.
Elements and functionalities of ECEC data systems
The capacity of data systems to support monitoring and improvement practices in ECEC depends critically on their internal architecture and on the variety and granularity of the information they maintain. These features include a range of potential data elements and functionalities, chief among them the possibility of linking different types of data. Responses to the ECEC in a Digital World policy survey (2022) reveal significant variation across countries and jurisdictions in the design of their ECEC data systems (Figure 8.4).
Unique and permanent identifiers for ECEC settings are the most common feature of ECEC data systems, being available in 81% of the countries and jurisdictions that report having such a system in place. Unique personal identifiers for children participating in ECEC are, in turn, present in 65% of the systems across countries and jurisdictions, whereas unique personal identifiers for ECEC staff members are reported by only 53% of countries and jurisdictions. These identifiers – whether at the setting or individual level – distinguish longitudinal data systems from repositories of cross-sectional data sets and are a necessary condition for linking information gathered at different points in time and thus for assessing change over time. Unique identifiers are also required to sort data entities into nested structures, for instance children within settings or classrooms. Identifiers may be specific to an ECEC data system or shared with other data systems, for instance ID numbers from census or social security registries, or “unique learner numbers” that remain with individual children throughout their progress in the education system. Shared identifiers facilitate the linkage of data from different sources and can thus reduce the data collection burden, but they may also bring increased privacy risks. Setting-level identifiers are essential for supporting monitoring and evaluation efforts at the setting and system levels, but individual-level permanent identifiers are also required for ECEC data systems to be able to document children’s developmental and learning trajectories, identify their needs, and sustain robust analyses of the impact of different ECEC programmes and practices.
The availability of demographic data for individual children (e.g. date of birth, gender, family characteristics, special needs) is another common feature of ECEC data systems, with such elements being reported by 75% of countries and jurisdictions. Individual-level data on staff members, including both demographic characteristics and information on their qualifications and experience, are slightly less commonly available but still reported by around two-thirds of countries and jurisdictions.
Less than 40% of survey respondents indicated that financial reports and monitoring or inspection results for ECEC settings are integrated into their data systems. This may again point to governance models where responsibilities for these activities are assigned to different agencies and where limited data‑sharing agreements exist. At the individual level, data on children’s development and learning is an element available in less than 20% of the countries and jurisdictions, suggesting that assessments of children’s outcomes are not generally integrated into the evaluation and monitoring processes supported by these ECEC data systems. This might be explained by the prevalence of non-formal or non-standardised monitoring practices such as observation, documentation through portfolios or narrative assessments for children of that age.
Lastly, the capacity to link setting-level to child-level data within the data system is reported by 55% of the countries and jurisdictions, while linkages between setting-level and staff member-level data are reported by 44% of them. Where present, those linkages may bring important policy analysis and research opportunities, including at the system level. While countries may choose to focus quality monitoring and reporting at an aggregate level (e.g. setting, programme, jurisdiction), the possibility of linking setting- and individual-level data is critical to inform policies aiming to assess and foster quality in ECEC and to mitigate inequalities through ECEC.
While the ECEC in a Digital World policy survey (2022) did not specifically enquire about data on pedagogical practices and other types of interactions between children and staff in ECEC settings, the fact that only 38% of countries and jurisdictions indicated that monitoring and inspection results of ECEC settings were integrated into their data system suggests that the collection of data on process quality could be expanded. Incorporating this type of information into ECEC data systems may represent a promising avenue for advancing research and policy analysis with a focus on process quality. The LinkB5 data system in the state of Virginia, in the United States, provides an example of integrating data on the quality of teacher-child interactions measured at the classroom level (Box 8.1).
Box 8.1. Incorporating process quality data into early childhood education and care data systems
LinkB5: The data system for Virginia’s Unified Quality Birth to Five System
In 2020, the Virginia General Assembly passed legislation to establish a unified public-private system for early care and education, administered by the Virginia Department of Education. Among the key actions required of the Department of Education is the implementation of a new quality measurement and improvement system, called the Virginia Quality Birth to Five System (VQB5), with the goals of monitoring and improving quality across all publicly funded ECEC settings for children from birth to five years old in the state and of supporting families in choosing quality options. This requires collecting consistent information about different types of programmes, including Head Start, Mixed Delivery, public schools and family day homes, to better understand quality challenges across the entire landscape of Virginia’s ECEC system.
LinkB5 is the data system for Virginia's unified measurement and improvement system. It collects information on a variety of dimensions of ECEC programmes. Information about sites includes filled and open enrolment slots, pay ranges for educators, and information about the physical spaces where children play and learn. Information about site administrators and teachers includes years of experience, educational background and language proficiency. Importantly, LinkB5 is also used to collect information related to the quality of children’s ECEC experiences down to the classroom level: the system houses systematic data about the quality of teacher-child interactions, as measured by the Classroom Assessment Scoring System assessment and collected twice a year since 2021, as well as data on curricular adoption, both at the classroom level.
Source: University of Virginia (n.d.), LinkB5 Project for Early Childhood Data Collection, https://education.virginia.edu/research-initiatives/research-centers-labs/center-advanced-study-teaching-and-learning/castl-research-projects/infant-toddler-prek-research-projects/linkb5-project-early-childhood-data-collection (accessed on 10 December 2022).
Digitalisation-related elements in early childhood education and care quality monitoring
ECEC systems are responding to digitalisation challenges in multiple ways. Many countries are reviewing their curriculum frameworks to position early digital literacy among the multiple developmental and learning goals for young children, and providing pedagogical guidance to ECEC staff on using digital tools with children in ECEC settings (see Chapter 4). Workforce development strategies are also being adapted in many of these countries to integrate demands for promoting digital competencies among staff (see Chapter 5). This section looks at the extent to which quality monitoring frameworks are beginning to cover aspects related to the use of digital technologies in ECEC.
Responses to the ECEC in a Digital World policy survey (2022) suggest that the monitoring of these aspects is not yet the norm in ECEC systems, with less than half of the participating countries and jurisdictions reporting that any of the digitalisation-related aspects listed in the questionnaire are included in their evaluations of quality in ECEC settings, as carried out by inspectors or agencies external to the settings (Figure 8.5). The most commonly monitored aspect is the competencies of staff or centre leaders in using digital technologies for pedagogical work with children, as defined by a relevant framework or quality standards (almost 40% of the countries and jurisdictions), while professional competencies for the use of digital tools in other types of work processes (e.g. administrative tasks, professional collaboration) or in communicating and engaging with families are less often the object of evaluations (27% and 19% of countries and jurisdictions, respectively). The availability of digital infrastructure in ECEC settings is evaluated in 27% of the participating countries and jurisdictions.
The same is true of aspects through which digital technologies may affect process quality more directly. The quality of the interactions that young children may have with digital technologies in ECEC settings, as defined by a relevant framework or standards, is monitored in 35% of the participating countries and jurisdictions, whereas the amount of time children may spend interacting with digital technologies within settings, also in reference to a pre-defined framework or set of standards, is monitored in 19% of the countries and jurisdictions. Generally, monitoring is gradually being extended from structural aspects of quality (e.g. safety, class size) to process quality aspects, although this trend is still not very developed in many countries (OECD, 2021[5]; 2022[4]). Results with regard to the use of digital technologies in ECEC may be seen to reflect this trend, with only a relatively small share of countries currently monitoring their potential contribution to process quality.
The Czech Republic, Israel and the Slovak Republic are the countries reporting the most extensive monitoring of these aspects (all elements). In Norway, staff’s or centre leaders’ competencies in using digital technologies for pedagogical work with children are monitored in line with the requirements listed in the Framework Plan for Kindergartens, which includes guidance on kindergartens’ digital practice. In Luxembourg, the quality of interactions and the time children spend with digital tools are monitored as part of the support that ECEC teachers receive, from specialised teachers, to implement the Mediekompass reference framework for media literacy (see Case study LUX – Annex C). In Australia, while the availability of digital infrastructure and the use of digital technologies in ECEC settings are not explicitly part of quality monitoring, they can be implicitly considered in the monitoring of areas such as educational practices, children’s health and safety, and collaborative partnerships with families and communities.
Policy pointers
Policy pointer 1: Strengthen the data infrastructure of the ECEC sector to support quality monitoring as well as policy analysis and research
Promoting data sharing across bodies with different responsibilities in the ECEC sector and setting up comprehensive data systems is crucial to bring evidence together to facilitate holistic and periodic evaluations of the ECEC system and in-depth analyses of ECEC policies and practices. Besides their own monitoring and research efforts, ECEC authorities can create conditions for external accredited researchers to access data about the ECEC sector safely and responsibly in order to carry out independent studies.
A wide range of data is generated in the ECEC sector. The different types of evidence can be reviewed with the goal of integrating data about its multiple aspects, including both structural and process quality as well as contextual information, and of supporting quality monitoring and policy analysis. Semantic standards for ECEC indicators can help different audiences make sense of the data and build trust in the consistency of the information across reports.
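As one possible way of operationalising such semantic standards, the sketch below shows a hypothetical machine-readable definition of an enrolment-rate indicator that different agencies could share, so that the same figure is computed consistently across reports. The indicator, field names and disaggregations are illustrative assumptions only.

```python
# A sketch of a shared, machine-readable indicator definition. The indicator
# shown and its fields are hypothetical examples, not an established standard.
from dataclasses import dataclass, field

@dataclass
class IndicatorDefinition:
    code: str
    name: str
    numerator: str            # definition of the count in the numerator
    denominator: str          # definition of the reference population
    disaggregations: list = field(default_factory=list)

enrolment_rate = IndicatorDefinition(
    code="ECEC-ENR-3TO5",
    name="Enrolment rate of children aged 3-5 in ECEC",
    numerator="children aged 3-5 enrolled in a registered ECEC setting on the reference date",
    denominator="resident population aged 3-5 on the reference date",
    disaggregations=["region", "provider type", "household income quintile"],
)
print(enrolment_rate)
```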
Ensuring strong data security and strong privacy protection, for both children and staff, and across the entire data life cycle, from collection to processing and release, is essential to promote trust in data management practices in the ECEC sector. This requires specifying desired data uses and expected benefits, identifying threats and vulnerabilities to privacy, and implementing appropriate security and privacy controls that are consistent with those uses, threats and vulnerabilities.
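One widely used privacy control consistent with this approach is to replace direct personal identifiers with stable pseudonyms before data are shared for analysis, so that records can still be linked over time without exposing the original identifiers. The sketch below shows a generic keyed-hash pseudonymisation step; the approach and identifier format are illustrative examples, not a prescribed standard for ECEC data systems.

```python
# Illustrative only: deriving stable pseudonyms from personal identifiers with
# a keyed hash, so linkage over time remains possible without exposing the
# original identifier. The identifier format is invented.
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)   # held by the data controller, never released

def pseudonymise(personal_id: str) -> str:
    """Derive a stable pseudonym from a personal identifier using a keyed hash."""
    return hmac.new(SECRET_KEY, personal_id.encode(), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym within a given key,
# enabling longitudinal linkage in the released data set.
print(pseudonymise("120518-123X"))
```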
Policy pointer 2: Align quality monitoring frameworks with responses to digitalisation in other policy levers to ensure consistent policy strategies
The ongoing trend towards extending ECEC quality monitoring from structural to process quality dimensions can be further supported by monitoring any potential adaptations and novel targets introduced in curriculum and learning frameworks in response to digitalisation, including objectives around children’s early digital literacy and the roles expected from digital technologies in pedagogical interactions with children and in engagement with families.
Promoting digital competencies among ECEC professionals requires adequate training opportunities. ECEC systems need to monitor the quality of workforce preparation programmes that target these competencies, including both initial education and in-service training. The types and levels of digital competencies among staff can be monitored according to their specific roles and responsibilities.
Monitoring the quality of the digital infrastructure across the ECEC sector is important to ensure that all settings have adequate digital resources to meet the demands placed on them, and to identify and reduce digital divides.
References
[14] Culhane, D. et al. (2017), “Maximizing the use of integrated data systems: Understanding the challenges and advancing solutions”, The ANNALS of the American Academy of Political and Social Science, Vol. 675/1, pp. 221-239, https://doi.org/10.1177/0002716217743441.
[7] Data Quality Campaign (2017), Education Data 101: A Briefing Book for Policymakers, Data Quality Campaign, https://dataqualitycampaign.org/resource/eddata101 (accessed on 10 June 2022).
[13] Fantuzzo, J. et al. (2021), “Expansion of quality preschool in Philadelphia: Leveraging an evidence-based, integrated data system to provide actionable intelligence for policy and program planning”, Children and Youth Services Review, Vol. 127, p. 106093, https://doi.org/10.1016/j.childyouth.2021.106093.
[12] Fantuzzo, J. and D. Culhane (eds.) (2015), Actionable Intelligence, Palgrave Macmillan US, New York, NY, https://doi.org/10.1057/9781137475114.
[11] Figlio, D., K. Karbownik and K. Salvanes (2016), “Education research and administrative data”, in Handbook of the Economics of Education, Elsevier, https://doi.org/10.1016/b978-0-444-63459-7.00002-6.
[8] Kagan, S. et al. (2019), “Data to drive improvement”, in Kagan, S. (ed.), The Early Advantage: Building Systems That Work for Young Children, Volume 2, Teachers College Press, New York, NY.
[9] Murnane, R. and J. Willett (2010), Methods Matter: Improving Causal Inference in Educational and Social Science Research, Oxford University Press, Oxford.
[15] OECD (2022), ECEC in a Digital World policy survey, OECD, Paris.
[1] OECD (2022), Going Digital to Advance Data Governance for Growth and Well-being, OECD Publishing, Paris, https://doi.org/10.1787/e3d783b0-en.
[4] OECD (2022), “Quality assurance and improvement in the early education and care sector”, OECD Education Policy Perspectives, No. 55, OECD Publishing, Paris, https://doi.org/10.1787/774688bf-en.
[5] OECD (2021), Starting Strong VI: Supporting Meaningful Interactions in Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://doi.org/10.1787/f47a06ae-en.
[6] OECD (2018), Engaging Young Children: Lessons from Research about Quality in Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://doi.org/10.1787/9789264085145-en.
[3] OECD (2015), Starting Strong IV: Monitoring Quality in Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://doi.org/10.1787/9789264233515-en.
[2] OECD (2012), Starting Strong III: A Quality Toolbox for Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://doi.org/10.1787/9789264123564-en.
[10] Saw, G. and B. Schneider (2016), “Challenges and opportunities for estimating effects with large-scale education data sets”, Contemporary Educational Research Quarterly, Vol. 23/4, pp. 93-119, https://doi.org/10.6151/CERQ.2015.2304.04.