Strengthening Early Childhood Education and Care in Luxembourg
3. Quality assurance and improvement in the non-formal early childhood education and care sector in Luxembourg
Abstract
This chapter discusses Luxembourg's main strengths and challenges in developing a system for quality assurance and improvement that best supports the non-formal early childhood education and care sector. It describes governance and responsibilities in monitoring quality, monitoring processes and tools, and the consequences and use of monitoring results to promote improvement, particularly in non-formal education.
Introduction
Governments worldwide have increased their investments in early childhood education and care (ECEC) to improve the accessibility and quality of ECEC services, both of which are needed for ECEC services to deliver the benefits demonstrated by rigorous research evidence (OECD, 2015[1]; 2018[2]; 2021[3]). To ensure high quality, countries are moving towards developing a highly structured, professionalised and regulated ECEC system. Data and monitoring are important drivers of quality and essential parts of quality assurance systems. They help demonstrate facts and trends, thus producing evidence on equitable access to ECEC, whether minimum standards of quality are met, whether ECEC settings offer children experiences that support their development and well-being and whether measures are in place to achieve improvements (OECD, 2015[1]; 2018[2]; 2021[3]; EC/EACEA/Eurydice, 2019[4]). Supporting such developments, research has shown that the presence and efficiency of regulation and data collection mechanisms across countries are associated with better educational outcomes for children (Pascal et al., 2013[5]).
Across countries, levels of regulation and data collection vary, with greater variation observed for the younger age group in ECEC and for home-based providers. Some countries are increasing their efforts to strengthen quality assurance mechanisms in those areas (EC/EACEA/Eurydice, 2019[4]). In addition, countries increasingly focus on developing monitoring systems that seek to enhance process quality as well as structural quality. This reflects a consensus that structural quality aspects (those addressing space, materials, staff-to-child ratios, group sizes, levels of staff qualification and curriculum frameworks) are closely linked to the conditions for children in ECEC to experience good-quality interactions (process quality), which are the main drivers of their development (Melhuish et al., 2015[6]; OECD, 2021[3]; Shuey and Kankaras, 2018[7]).
These trends can also be observed in the ECEC sector in Luxembourg. Following the introduction of the ECEC subsidy funding scheme (chèque-service accueil, CSA) in 2009 to increase the accessibility of ECEC in the non-formal sector, services have expanded rapidly and the participation of young children (under age 3) in ECEC has increased, particularly through the vast growth of the private for-profit sector. In addition to expanding provision, Luxembourg also introduced further measures for the development and assurance of quality in the non-formal education sector. These measures cover all types of non-formal providers and all age groups and include a focus on structural and process quality (Achten and Bodeving, 2017[8]; Luxembourg Ministry of Education, Children and Youth, 2020[9]).
Monitoring quality and measuring effectiveness are challenging tasks that require co-ordinated and strategic processes as well as the collection of reliable data (OECD, 2015[1]). Countries employ many different approaches to ensure that monitoring results and data can inform planning, contribute to more efficient resource allocation, and lead to improved programme quality and child outcomes (OECD, 2018[2]). This chapter discusses the strengths and challenges of Luxembourg's current policy and practice, as well as developments underway, with a focus on non-formal education, in relation to the following areas:
governance and responsibilities in monitoring quality, considering differences across monitoring areas and purposes of monitoring
monitoring processes, including a focus on the preparation of evaluators, and the frameworks and tools in place to support monitoring
the consequences and use of monitoring results.
The chapter also makes recommendations for ongoing and future policy developments, as summarised in Box 3.1.
Box 3.1. Policy recommendations
Recommendations for the whole ECEC sector
Governance of quality assurance and monitoring
Continue to locate all monitoring activity for non-formal and formal education within the Ministry of Education, Children and Youth because a single ministry supports a coherent accountability and improvement system.
Bring together knowledge on ECEC quality across the formal and non-formal sectors to build collaboration and create connections for children and families.
Prioritise work on the centrally organised, systematic collection of information on the sufficiency of ECEC provision, characteristics of children and families participating in ECEC and the diversity of the workforce, including the language profiles of children, families and staff.
Recommendations for non-formal education
Governance and responsibilities for quality assurance
Strengthen the channels through which intelligence on quality gathered during visits from the two monitoring bodies (the Direction générale du secteur de l’enfance and the Service national de la jeunesse) can be used to improve quality, including by reviewing funding allocations as well as creating a feedback loop on pre-service and in-service workforce training.
Improve communication of the new responsibilities of the two bodies in their roles of control versus support to enhance the work of both. Act on several fronts to enhance the capacities of regional officers to support quality improvement in the sector (e.g. better preparedness for their role, more time to focus on quality improvement and well-identified tools).
Monitoring processes and tools
Broaden the sources of information on quality available to regional officers during their visits. Engage ECEC staff, parents and children in monitoring visits (possibly administering parent surveys in advance of the visit) to broaden and deepen the knowledge of process quality.
Develop systematic observations of staff interactions with children as well as children’s interactions with one another, and introduce observational monitoring tools to assess process quality. The framework against which process quality would be assessed requires a clear focus on interactions that foster children’s development in line with the goals of the curriculum framework, such as experiences that foster the development of socio-emotional skills, emotional resilience, or respect for diversity.
Introduce recommendations for ECEC staff to document children’s engagement in learning experiences to aid identification of children’s needs and interests, and communicate this to parents and schools for children enrolled in formal education. Aspects that might be useful to document are child experiences that encourage the development of 21st-century skills (e.g. collaboration, creativity and self-regulation).
Offer further training for regional officers on making recommendations for improvement and supporting providers to draw up their own self-improvement plans.
Expand the selection and/or training requirements of regional officers to include specialisation in early childhood and ideally work experience in ECEC provision.
Require providers to have a comprehensive complaint management process and develop a more formal system for service users to have complaints reviewed by regional officers.
The consequences and use of monitoring results
Review the steps available when a provider consistently falls short of expected quality levels. Develop and use self-improvement plans to follow up on implementing improvement strategies and to identify persistently low-quality providers. Ensure that improvement strategies are acted upon by providers, implemented in a timely manner and reviewed by regional officers.
Encourage centre leaders to share the results of external monitoring with ECEC staff members (perhaps in condensed form) and involve staff in designing improvement plans. Strengthen the capacity of staff to undertake self-evaluation and improvement planning in their own provision (e.g. through specific training). Ensure the results of self-evaluation are reviewed during inspections.
Consider using monitoring results to feed into a risk-based, proportionate approach to scheduling monitoring visits, which will help channel monitoring resources, for example by introducing flexibility in the frequency of monitoring visits to tailor inspection to the needs of providers.
Investigate better ways to channel information about gaps in process quality to the training institutions so that monitoring feeds directly into future in-service and pre-service training.
Introduce a requirement to publish (for example, on a parent portal or through formal communication to a structure’s “parent committee”) condensed information on the monitoring results (e.g. improvement plans and progress towards achieving goals).
Use data to analyse costs for ECEC providers in relation to subsidies and parent fees to understand whether tighter mechanisms need to be developed so that resources are targeted efficiently to support quality and equity.
Use this analysis also to clarify if there is a need to review public funding in the contracted and non-contracted sectors or if targeted funding should be allocated to certain providers based on the characteristics of participating children.
Use resulting data to identify patterns of ECEC participation among diverse families, including the use of home-based versus centre-based ECEC, enrolment in contracted versus non-contracted settings, and reasons for forgoing participation in settings for children aged 3-4 (éducation précoce).
Governance and responsibilities for quality assurance in the ECEC sector
Context
Countries employ different monitoring systems, which can be indicative of different systems of provision for different age groups and reflect levels of centralisation or decentralisation of governance and supervision (EC/EACEA/Eurydice, 2019[10]). The scope of external monitoring is often related to the type of body responsible for the external evaluation of settings (EC/EACEA/Eurydice, 2019[4]).
In Luxembourg, the provision of ECEC is split between the non-formal sector, which provides ECEC for children until age 3 or 4, and the formal sector, which provides ECEC for children over 3 and shares responsibility with the non-formal sector for children up to age 12 during out-of-school time. National quality assurance arrangements for ECEC, including licensing, regulation, inspection and quality assurance, exist for all types of ECEC provision, including formal and non-formal education providers, and centre-based as well as home-based provision.
Across both the non-formal and formal sectors, the education authority has the main responsibility for governing ECEC, and separate national bodies under the Ministry of Education, Children and Youth (MENJE) are in charge of funding, regulating, organising and supervising monitoring. A split pattern between different bodies responsible for regulatory inspection and process quality can be observed in several countries in Europe. Similar to Luxembourg, separate inspectorates also exist at the national level in several countries (including Ireland, Wales [United Kingdom] and Scotland [United Kingdom]) (EC/EACEA/Eurydice, 2019[4]; OECD, 2021[3]). This split pattern can result from process quality monitoring being introduced well after inspection processes for structural quality were established.
In Luxembourg, the separation of monitoring across different divisions reflects the split between the formal and non-formal sectors, as well as the historically separate introduction of structural and process quality monitoring over time, with increasing efforts currently underway to separate the inspection role of control from the role of quality improvement.
The provision of non-formal ECEC is regulated by the ASFT Act (Loi du 8 septembre 1998 réglant les relations entre l’Etat et les organismes oeuvrant dans les domaines social, familial et thérapeutique), which defines formal requirements for ECEC providers as well as safety regulations. Requirements concerning structural quality (child-staff ratios, group sizes, infrastructure, etc.) are set by a Grand-Ducal regulation (Règlement grand-ducal modifié du 14 novembre 2013 concernant l’agrément à accorder aux gestionnaires de services d’éducation et d’accueil pour enfants). Since 2007, a law has been in place to regulate the provision of home-based care, with a focus on structural quality aspects such as space, the number of children in care, and the education and training of childminders (Loi du 30 novembre 2007 portant réglementation de l’activité d’assistance parentale) (Achten and Bodeving, 2017[8]). Settings must meet those standards to achieve and maintain licensing, which is statutory for all providers (SNJ, 2021[11]).
Measures for process quality are anchored in the Youth Act of 2008 (Loi sur la jeunesse) and its 2016 modification. Until 2016, non-formal education settings had no obligations regarding process quality (SNJ, 2021[11]). In Luxembourg, the quality assurance system in non-formal education is linked to the ECEC subsidy funding scheme, and settings that wish to be recognised under the scheme by the Ministry of Education, Children and Youth (which is nearly all providers) must meet a number of requirements. These include implementing the national curriculum, the National Reference Framework on the Non-formal Education of Children and Youth (Cadre de référence national sur l’éducation non formelle des enfants et des jeunes); implementing the multilingual education programme (for children aged 1‑4); meeting obligations on continuous professional development; and accepting external evaluations through regular monitoring visits.
In the non-formal sector, there is a split in quality monitoring responsibilities and functions between two divisions, both under MENJE. Agents (or inspectors) under the Department for Children (Direction générale du secteur de l’enfance) are in charge of licensing and maintaining and managing the national register, checking eligibility for the ECEC subsidy funding scheme, and monitoring for accountability purposes, with a stronger focus on structural quality aspects and compliance with regulatory standards (Figure 3.1). On the other hand, regional officers (agents régionaux) working under the National Youth Service (Service national de la jeunesse, SNJ) have a strong mission to monitor process quality and focus on quality improvement. Their monitoring body is relatively new, having been introduced in 2017.
The system with two separate bodies in charge of monitoring procedures requires communication channels between the two divisions: 1) SNJ regional officers use information from structural monitoring and inspection to check whether the conditions for curriculum requirements are in place (e.g. the level of qualifications for multilingual programmes); 2) SNJ regional officers, who visit settings more frequently, are in communication with agents under the Department for Children and can report on issues beyond their core mission if weaknesses or non-compliance are observed during their monitoring process, a role seen as important for quality improvement; and 3) to support the development of quality standards, regional officers also collaborate with other divisions and external institutions (e.g. the SNJ’s Innovation Division, the multilingual programme team, the University of Luxembourg and the Agence Dageselteren).
In the formal ECEC sector, regional directors are responsible for the pedagogical and administrative management of schools. The country is divided into 15 regions; each region, under the minister’s authority, is managed by one director, assisted by two to four deputy directors. Regional directors are in charge of pedagogical monitoring as well as co-ordinating the management of children with special educational needs and disabilities.
Co-ordination between agencies with roles for non-formal education could be further improved
In Luxembourg, the fact that different ECEC monitoring responsibilities are under one single authority of MENJE provides a structure set up to strengthen communication and collaboration between different divisions overseeing different aspects of ECEC provision. Systematic co-operation occurs at various levels (e.g. regional officers exchange with the Department for Children on issues and questions related to specific ECEC settings).
For non-formal education, co-ordination channels exist between the SNJ Développement de la qualité division and other SNJ divisions or MENJE departments (e.g. communication with the Department for Children, with the further training co-ordination team of SNJ, with the team of the multilingual programme, or with the agency for home-based ECEC) to ensure that the results of monitoring reports lead to necessary actions that address, for example, staff training in centre-based and home-based provision.
Until recently, communication channels between regional officers from SNJ and agents of the Department for Children were mainly informal. In response to the lack of systematic exchanges at this level, new exchange procedures between the two monitoring bodies (SNJ and the Department for Children) were recently introduced, and their roles were clarified. The new model was implemented in autumn 2021. These co-ordination efforts should be sustained, and follow-up is needed to ensure that the actions taken in response can guide further improvements across the sector.
Monitoring bodies (SNJ and the Department for Children) in Luxembourg have good relationships with the University of Luxembourg (Université du Luxembourg) and with independent research centres. They are regularly invited to collaborate in organising national conferences on different aspects of process quality (SNJ, 2021[11]). Through regular dialogue and specific government-sponsored research studies, those responsible for the monitoring systems review their work and make changes when appropriate. For example, the ambitious multilingual programme has been the subject of external research, a practice that should continue, as it strengthens one of Luxembourg’s flagship educational policies and will be of interest to many other countries.
Despite these efforts, systematic co-ordination could be improved, including through steps to improve data flow between the monitoring bodies. Currently, there are plans for a new digital database on ECEC, which would enable the Department for Children and SNJ regional officers to enter and access information on the results of their monitoring activities. A first step has been taken with the launch of the new version of the database, Banque de données assurance qualité, which concerns process quality. It is mainly used by SNJ and its regional officers, but it also sends agents from the Department for Children updates on ECEC centres that are not fulfilling their obligations.
A second project concerns a new database on ECEC structures and structural quality that MENJE would use. Data exchange between these two databases is being automatised to avoid duplication of work. There will be requirements for providers to enter information on this new second digital platform and upload documentation for inspectors from the Department for Children to check. The aim is to ensure that the Department for Children has access to and can check up-to-date information on settings. It is also hoped that this digital platform will improve transparency, streamline and rationalise staff activities in settings, as well as help external evaluators involved in the monitoring process.
The digital ECEC database should be used so that information on quality identified during regional officers’ visits can be linked to specific aspects of enrolment, staff training and funding. However, care will have to be taken to ensure that arrangements for entering, sharing and monitoring this information do not become too complex, nor take additional time away from staff focusing on children’s experiences. This is particularly important for small, non-formal, centre-based settings in Luxembourg, as well as home-based providers – both low in resources to deal with growing external and administrative demands and requirements.
Clarifying the roles of the two monitoring agencies for non-formal ECEC can enhance the work of both and strengthen the mechanisms to support quality improvement
Luxembourg increasingly emphasises the role of its regional officers in supporting quality improvement. It is striving for a clearer separation between monitoring for purposes of control and monitoring for purposes of quality improvement, anchored in the separate responsibilities of agents from the Department for Children and regional officers from SNJ. Despite these efforts, however, there have been certain overlaps in their roles, causing some tensions, particularly in relation to the roles of regional officers, which are mainly supportive but nevertheless include some element of control. First, regional officers inform agents under the Department for Children if non-compliance with regulations is observed during their visits. Second, they are one point of contact for complaints and forward them to responsible agents at the Department for Children where necessary (Achten and Bodeving, 2017[8]). The complaint office at the Department for Children can also be contacted directly by parents and professionals. Until the end of 2021, regional officers also checked compliance with professional development requirements (on the number of hours). Following recent dialogue between the two departments on their respective responsibilities, regional officers were released from this duty at the beginning of 2022. It is now up to ECEC centre managers to ensure that their staff fulfil legal requirements.
Releasing regional officers from duties to monitor compliance will help reduce some tensions. However, if the control function of regional officers is further reduced, MENJE will need to further consider the role of its agents (Department for Children) in monitoring compliance with regulations. The frequency of visits carried out by agents from the Department for Children is significantly lower than for regional officers, and mechanisms may have to be put in place to ensure the information on structural aspects of quality of provision is always up to date and regularly monitored. The introduction of the new digital database, as outlined above, might support this process.
In the short and long terms, bringing together knowledge on ECEC quality across the formal and non-formal sectors can build collaboration and create connections for children and families
The strength of the Luxembourg ECEC system is that both formal and non-formal provision are under one authority, MENJE, which is therefore well placed to ensure that ECEC services in both sectors complement each other and, in combination, provide children (and families) with the range of experiences best suited to support child development and well-being. Luxembourg’s introduction of a national curriculum framework for the non-formal sector (including ECEC) aligned with the formal ECEC sector curriculum is an important step in this direction. What connects the frameworks is the definition of “fields of action” – the range of experiences to be offered to children to support their well-being and learning holistically. At the same time, there are important intended differences between the two sectors. These relate to pedagogical principles and approaches, which reflect what is specific to the provision of ECEC in each sector.
Thus, the ECEC system in Luxembourg sets itself the challenging task of ensuring connectedness while preserving each sector’s specific focus and intentions. The OECD review team heard about certain gaps in the shared understanding between the two sectors, suggesting that more could be done to bring together the knowledge and expertise on the provision of quality (including strengths and challenges) established in both the non-formal and formal sectors. Channels of communication between the two sectors should be strengthened. In particular, this should include a focus on learning and sharing about pedagogical and multilingual approaches that should facilitate planning across sectors that is in the best interests of the child (see Chapter 1).
Meeting demand and ensuring affordable access to ECEC require ongoing data collection
Participation rates have to be assessed in relation to the sufficiency of ECEC provision. Demand is higher than supply in many countries for the younger age group (children under 3), and shortages can be higher in some regions (e.g. rural areas) (OECD, 2020[12]). In combination with other barriers to participation, the shortage of places can mean that those families and children who might benefit most from ECEC participation are more likely to miss out.
Costs to parents are a main barrier to ECEC participation, particularly for low-income families. A real strength of the ECEC system in Luxembourg is unconditional free access of up to 20 hours per week for all children aged 1‑4 in centre-based settings; targeted entitlements to (additional) free hours for more disadvantaged populations; and subsidised hours depending on parents’ income (see Chapter 1). Importantly, these measures apply to children aged 1‑4 across all types of provision, apart from the free 20 hours per week, which are limited to centre-based settings. Yet, ECEC in Luxembourg does not come without parental costs. These relate to hours beyond the free offer, most notably before children enter the formal schooling sector. In Luxembourg, there are no regulations on parental costs in the non-contracted sector, which is the main sector providing for children under the age of 3. Moreover, children enrolled in home‑based settings do not benefit from the 20 free hours, although they are covered by the subsidy funding system.
In addition, parental leave policies in Luxembourg mean that there can be a gap between when parental leave ends and age 1, which is when free access to ECEC starts (under the ECEC subsidy funding system). This gap could affect low-income families disproportionally, since full-time parental leave lasts four to six months, leaving ECEC costs up to parents for the remaining six to eight months. In addition, some families (in lower paid employment in particular) may have less flexibility to take up part-time parental leave entitlements to stretch parental leave over more months, for example up to age 1.
Collecting information on the uptake of home-based services and centre-based services in both the contracted and non-contracted sectors and linking this information to existing data on the fees charged by non-contracted settings would enable the government in Luxembourg to better understand the successes and limitations of the current approach to free and subsidised ECEC (see Chapter 1).
Systematic data collection on ECEC in Luxembourg is expanding and can be further strengthened
The evaluation of policies that ensure accessibility and quality of ECEC services requires availability of information on what is effective, in which context and for whom. In terms of access, crucial information includes data on participation rates in relation to the sufficiency of provision, or parental costs, linked to the types of providers and demographics of users. In Luxembourg, and regarding accessibility and quality of non-formal ECEC provision, aspects that need consideration are:
The diversity of families and children participating in ECEC. In Luxembourg, such diversity is immense in relation to children’s language backgrounds as well as parental values and expectations concerning ECEC. While language barriers, knowledge of procedures, or differences in values and beliefs can create barriers to participation (Eurofound, 2012[13]), these factors can also lead families with different background characteristics to choose or find access to different types of ECEC providers. This can cause a division between types of providers and risk segregation across ECEC services. Segregation can reduce the degree of social mix that has been found to be beneficial to children’s development, and it can place greater demands on providers serving children and families with more disadvantages or needs (de Haan et al., 2013[14]; Early et al., 2010[15]; Kuger and Kluczniok, 2008[16]).
The diversity of the workforce. Luxembourg’s workforce has been trained in different countries with different traditions (including Belgium, France and Germany), and the resulting diversity of initial training programmes is a challenge. Systematic collection of information on the workforce needs to include a focus on this issue. Importantly, working conditions also vary across different types of providers in Luxembourg, resulting in disparities in staff profiles between the private and contracted sectors in particular, and in issues with staff retention. These are all important issues to monitor.
The language profiles of users and the workforce in ECEC. The multilingual education programme is an important element of Luxembourg’s ECEC offer. Children in ECEC are to be exposed to Luxembourgish and French, and to be encouraged to express themselves in their home languages. Requirements on language competencies of the workforce are in place, but (related to the diversity of the workforce and the concentration of staff who commute across borders to work in certain regions and types of ECEC providers in Luxembourg) there are difficulties in meeting those requirements. To address language gaps and ensure that requirements can be met, systematic collection of information on children, families and staff should focus on their language profiles.
Providers outside the ECEC subsidy funding system. The quality assurance system in Luxembourg is comprehensive, and the regulations in place cover children under the age of 3 and home-based providers. However, the system is linked to the ECEC subsidy funding scheme. Thus, while mandatory for most centres and providers, those not relying on government funding in Luxembourg can work outside the quality assurance system. While the OECD review team heard that the percentage of centres and providers outside the subsidy funding scheme is very low, it would be important to identify them and introduce measures to better understand the quality of their provision, their resources, and the needs of the children and families they serve.
While Luxembourg collects general data on children’s participation rates at different ages (see, for example, EC/EACEA/Eurydice (2019[10])), there is no central systematic collection of information on those who use non-formal ECEC; the demographics of the workforce in the sector; the demand and sufficiency of provision; or working conditions. To support policy development that ensures the accessibility of services and addresses those factors that may impact the quality services can offer (e.g. retention issues, differences in the profile of the workforce across types of providers), the Ministry of Education, Children and Youth should prioritise putting into place a centrally organised, systematic collection of information on: 1) the background characteristics of families accessing different types of ECEC; 2) parental costs across different types of provision; 3) the demand and supply for ECEC places; 4) the socio-demographic profile of the workforce and their educational qualifications, further training, recruitment and retention; and their pay, working hours and work conditions. Attention will need to be paid to examining this information in relation to different types of ECEC providers and quality indicators.
The following sections of this chapter cover non-formal education only.
Monitoring processes and tools
Context
For all types of non-formal ECEC providers in Luxembourg, agents from the Department for Children are responsible for monitoring the following issues as part of the control of structural quality: staff-child ratios, staff qualifications, indoor and outdoor premises, safety regulations and health and/or hygiene regulations (the latter are additionally controlled by the Inspection du Travail et des Mines [ITM] and/or the Ministry of Health) (SNJ, 2021[11]). Following licensing, agents conduct monitoring visits every two or three years, and visits can happen unannounced.
Monitoring for process quality in Luxembourg includes a focus on pedagogy and curriculum framework implementation. Like many other countries, Luxembourg employs external evaluation practices and tools to monitor aspects of process quality. In addition, self-evaluation is a mandatory requirement for those ECEC providers under the ECEC subsidy funding scheme.
The national curriculum framework is the point of orientation for regional officers from SNJ, and the written pedagogical concept plays an important role in monitoring processes. In their monitoring role, regional officers check that pedagogical concepts and the approaches to practice planned by each setting are aligned with the curriculum framework. Regional officers visit settings one to two times per year and give settings two weeks’ written notice of their visit. Regional officers are expected to work closely with each setting assigned to them, with each regional officer monitoring up to 40 settings. Reflection is initiated during and following visits, and feedback is provided to leaders/managers of ECEC settings. The regional officers follow guidelines during their visits and when drafting their reports.
Monitoring quality does not have a positive impact per se; the procedures and tools for monitoring need to be aligned with intended purposes and implementation strategies (OECD, 2015[1]). With a focus on providing support for quality improvement, regional officers in charge of process quality have to collect reliable, relevant, and accurate information to help managers and leaders in ECEC settings make decisions on how best to improve their service. To collect this information, regional officers visit the setting, stay for three to four hours, and focus on three methods: the analysis of internal documents, an open dialogue with the leader of the setting and a check of the premises. During their visit, regional officers mainly interact with the leader, but if there are opportunities, they also interact (briefly) with some staff members or some of the children.
Centre-based ECEC settings have to provide the following internal documents for review by the regional officers: the pedagogical concept and the logbook (or the programme of activities for home-based providers) (Achten and Bodeving, 2017[8]; SNJ, 2021[11]). The pedagogical concept, which must be approved by the Ministry for a three-year period following an in-depth examination by SNJ, includes a pedagogical section describing the objectives and fundamental pedagogical principles at the local or regional level, self-assessment measures, a definition of areas for which pedagogical quality assurance projects are developed and a continuing professional development plan for the staff. Settings are required by law to make their concepts public, and this is seen as a mechanism that enables parents to compare settings in terms of pedagogical quality (Achten and Bodeving, 2017[8]). The logbook (journal de bord) contains regular (daily or weekly) written descriptions of the functions and the assignment of tasks within each setting, the work regulations of the setting, a list of daily activities with the children and an overview of the staff’s participation in continuing professional development (Achten and Bodeving, 2017[8]). Similarly, providers of home-based ECEC have to put together an establishment project and an activity report, and meet compulsory continuing training requirements (Luxembourg Ministry of Education, Children and Youth, 2020[9]).
The analysis of these documents, together with a conversation with the leader, aims to investigate pedagogical orientation and pedagogical practice and whether there is alignment with the national curriculum framework and the multilingual programme. Regional officers are trained to carry out exchanges with the leader of the setting, which focus on the curriculum framework’s seven areas of action, collaboration with parents and local networks, and management and staff collaboration. Their task is to address the following questions: How are the objectives implemented in daily practice? What activities or projects contribute to achieving these? What attitudes and pedagogical approaches are beneficial? To ensure the setting leader’s perspective is captured well, questions are deliberately worded as open questions (Achten and Bodeving, 2017[8]).
Regional officers also carry out a check of the premises, which mainly focuses on the quality of the physical environment: how spaces, furniture, equipment and play and learning materials are organised to facilitate experiences in all curriculum areas that nurture development and support recreational activities and rest. Children and staff are not necessarily present during this observational walk through the setting. However, if they are, regional officers also take notice of children’s engagement and the pedagogical practice they observe. Thus, the external evaluation focuses on the implementation of the curriculum – mainly evaluating the alignment with its principles and addressing the areas of action in the planning of daily interaction with children.
SNJ’s revision plans currently focus on a new framework document to provide a clearer structure for how regional officers should provide their evaluation reports. Currently, the reports are very descriptive, and the intention is to include an evaluation of the strengths and weaknesses of the quality of provision in different areas, all linked to the curriculum framework. New guidelines to better support regional officers in making judgements on quality aspects have been developed, and implementation started at the end of 2021. The guidelines include pedagogical approaches, environment and materials, staff-child interactions, interactions with parents, and the quality of the management. Indicators are being developed for each area to help regional officers in their evaluation and to better ensure coherence. The indicators have been grouped into six “quality dimensions” (quality of staff, quality of the infrastructure and the equipment, quality of the interaction with the children, quality of the pedagogical offer, quality of the relations with parents, quality of management). The aim is not to provide a quantitative judgement of the quality of a setting but to identify ways to promote quality at the setting level, as well as to understand how the sector as a whole is developing and where resources and support need to be improved.
Staff members can offer valuable perspectives during monitoring visits
Assessing process quality also covers interactions and the overall quality of instruction and care. When monitoring visits focus on processes, they often intend to examine the relations between staff and children, between staff and parents, and collaboration within the staff team. Observation is increasingly used in other countries to monitor process quality, including curriculum implementation. In many countries, observations are also combined with interviews with managers and staff (e.g. England [United Kingdom], Ireland, Norway).
Monitoring in Luxembourg relies heavily on the analysis of internal documents. Writing these documents requires knowledge and skills that can differ from those staff need in their daily interactions with children; moreover, internal documents are usually put together by the leader of the setting, not always in collaboration with the staff. In addition, regional officers mainly talk to leaders of settings during their visits. Thus, the voices of those interacting with children in settings go mostly unheard, and actual pedagogical practices are currently not observed. To deepen knowledge on process quality, there is a need to broaden the sources of information on quality available to regional officers during their visits, including meetings and discussions with staff members. Introducing systematic observations of staff and children during everyday activities would also deepen this knowledge.
As SNJ has revised the guidelines for monitoring procedures, the next step would be to ensure that these revisions translate into changes in practice during monitoring visits, with an increased focus on collecting information on interactions through observations in settings, an approach under consideration for the future. If implemented, a clear description of the aims, focus and methods of observations will be needed to guide regional officers and ensure consistent procedures across monitoring visits. With the new monitoring framework under development, there is an opportunity to focus on how interactions support children’s well‑being and development.
Observed quality indicators are needed to substantiate the degree to which pedagogical intentions found in the pedagogical concept and logbook are related to observed staff behaviours that support children’s development, including multilingual skills. For example, a focus on observing the quality of adult-child interactions could consider how adults engage in joint interactions with children and help them engage in communication and “sustained shared thinking” (Siraj, Kingston and Melhuish, 2015[17]). This includes adults not only observing and following children’s lead and allowing for child-directed play but also enhancing children’s creative play in setting up the environment and interacting in ways that provide opportunities for exploration in a context specifically designed to enhance language, development and learning. Relevant for the Luxembourg context and its multilingual education, for example, could be a focus on how adults model languages and engage children in extended exchanges and purposeful conversation, making use of a range of languages, including home languages where possible.
In further developing this new framework and the new tool for regional officers, a clear focus on interactions and on how interactions support children’s well-being and development will be important, e.g. in terms of children’s social and emotional experiences, their developing socio-emotional skills, self-regulation and resilience. Indicators on the use of multiple languages in daily life and in communications with parents need to be included. The quality indicators included need to be descriptive rather than evaluative; it is the overall profile of strengths and challenges that can form the basis for a self-improvement plan.
Self-evaluation can guide improvement
Self-evaluations are increasingly seen as a key dimension of ensuring the quality of provision in the ECEC sector, and there is a strong international trend towards developing policies and practices in this area. ECEC leaders and/or staff commonly employ self-evaluations to assess their centre’s level of quality (OECD, 2021[3]). Staff working directly with children are an important source of information on the resources and support needed to provide good process quality. Self-evaluations can, for example, focus on collaboration in the team, communication and management, and assess what can be done to improve these aspects.
For settings under the ECEC subsidy funding system in Luxembourg, self-assessment procedures are mandatory and aim to strengthen discourse and reflection within the team, and thus lead to continuous improvement in quality (Achten and Bodeving, 2017[8]). In introducing self-evaluation into their quality assurance system, Luxembourg has recognised the value of promoting self-evaluation as one essential strategy to ensure quality. However, the results of self-evaluations are not currently used during monitoring visits, and little seems to be known about the resources and motivation of providers to engage in these processes, along with the impact these processes have on quality improvement. In response to this situation, a pilot project is currently ongoing. It tests a new self-evaluation tool developed by SNJ and made available to settings.
To ensure that self-evaluation leads to improvement, managers, leaders and practitioners in ECEC need to be provided with resources to help them build their skills and capacity to undertake effective self-evaluation and improvement planning in their own provision. External evaluators (in the case of Luxembourg, the regional officers carrying out monitoring visits) can play an important role in providing guidance on effective self-evaluation and improvement arrangements. The development of self-evaluation in ECEC settings can be led by those agencies in charge of external monitoring. Doing so will ensure that monitoring on both sides draws on the same standards and quality indicators.
Luxembourg is taking some important steps in this direction. Linked to the new framework for external evaluations and thus closely aligned with the curriculum and aiming to assess curriculum implementation, SNJ and MENJE are currently developing a framework and indicators for ECEC settings to guide their self‑evaluation. Two issues need consideration here. First, to address differences in understanding of quality and adapt for different types of provision and contexts, it will be important to include the insider view of leaders and staff in settings to guide the development of such frameworks and tools. Luxembourg is currently planning small pilot studies, and feedback from participating settings will be an important step to ensure their voices are included. Second, to encourage reflection in the team and improvement of pedagogical practices and better implementation of the curriculum, it will be important that new tools for self-reflection ensure the involvement of ECEC staff with different backgrounds and experience in self‑assessment processes.
Self-evaluation results can be part of external evaluation processes. They can be the starting point for external evaluators and provide leaders of settings, along with staff, with opportunities to share what they know about their processes and practices in relation to the key aspects of the review framework (OECD, 2021[3]). Luxembourg plans to introduce self-evaluation results as part of its external evaluation process. This is an important step and will help to better include the team's views on the quality they offer and the resources needed to improve quality. This is particularly relevant since Luxembourg's non-formal, non‑contracted ECEC sector faces structural issues, with high staff turnover, high diversity in the workforce, and less favourable working conditions (see Chapter 2). These factors can make it difficult for staff to navigate resources and hinder their capacity for quality improvement.
Moreover, external evaluation processes can also help build staff capacity for self-evaluation if constructive dialogue and feedback on self-evaluation procedures and results are part of monitoring visits. Since Luxembourg is currently developing guidelines and tools for external and internal evaluations in parallel, this is an important opportunity to ensure that tools are aligned to complement each other and support bringing together information to facilitate shared reflections and improvement.
Mechanisms and strategies are needed to address children’s diverse needs and interests
Monitoring of child development can also be part of internal evaluations, and children’s actual experiences can be an essential focus of monitoring efforts (OECD, 2021[3]). Importantly, continuous and informal monitoring at the setting level may greatly help identify learning needs for staff and children, thus improving staff practices.
OECD countries have different views and take different approaches to monitoring child development; one common approach is the use of portfolios as a record of children’s experiences and growth (OECD, 2021[3]). The measurement of child development and learning for very young children needs to be approached carefully. For diagnostic purposes and supporting additional needs, however, there is evidence on the benefits of naturalistic observations carried out on an ongoing basis (Meisels and Atkins-Burnett, 2000[18]). Luxembourg, therefore, might wish to consider introducing some mandatory documentation of individual children’s engagement in learning experiences and their growth.
Considering the complexity of language profiles of children and staff in settings, and challenges with implementing a multilingual curriculum, a focus on monitoring individual children’s multilingual learning experiences could be particularly useful. Other aspects of child learning that might be of relevance to non‑formal education could also be considered, for example documenting children’s collaboration, creativity and self-regulation. Sharing this information with parents would strengthen parent partnerships and how parents and ECEC providers can work together to respond to individual children’s interests and needs (OECD, 2015[1]).
Making information on children’s learning experiences and development available within a setting and to parents can also help identify when additional support mechanisms should be put in place. Results of such monitoring efforts could help ECEC settings receive additional resources to help address identified learning needs – for example, specially adapted equipment or specific additional services. A common approach in educational guidelines on early language learning in Europe concerns support for children who have additional needs in speech, language and communication. This is implemented by providing speech therapy or other kinds of specialist support on an individual basis or by staff receiving additional support or coaching from external support systems. For example, in Portugal, speech therapy can be provided to children who are at risk of poor outcomes and who need additional support; in Scotland, speech and language delay are identified as an area of need that entitles children to additional support; in Wales and Northern Ireland (United Kingdom), in areas that have been identified as having additional needs in speech, language and communication, programmes for children under 3 are targeted specifically at disadvantaged children (see EC/EACEA/Eurydice (2019[10])).
Parents and children should be engaged in monitoring quality processes
Surveys with parents can also be part of evaluations. They can provide parents with an opportunity to give their opinion on the level of quality and indicate their degree of satisfaction (OECD, 2015[1]). Monitoring results in Luxembourg are currently not used to address issues related to parental wishes and their level of satisfaction. Apart from regional officers receiving parent complaints to pass on, parents are not involved in the monitoring process, and internal collection of parent voices is not part of the mandatory quality assurance system. Parents, however, are valuable sources of information and can offer important perspectives on the perceived well-being of their children, their hopes and wishes for them, and concerns they may have about their children’s well-being and development.
Luxembourg might want to consider involving parents in monitoring processes. For example, settings could be required to collect feedback from parents (for example, through parent surveys), and the results of these efforts could be shared with regional officers during the monitoring visit. Optional evaluation tools could be developed to support settings in collecting parent feedback. In addition, selected parents could be given the opportunity to talk to regional officers during monitoring visits, and again the development of tools could support regional officers in this task. In Europe, Croatia, Estonia and Norway, for example, have developed standardised questionnaires to support ECEC settings in involving parents in their internal evaluations. In addition, standardised questionnaires for external evaluators have been developed in Montenegro and Scotland (for children under and over 3). Standardised questionnaires for parents often ask for feedback on the following themes: co-operation and communication with parents, safety issues, the quality of children’s learning and care, and overall satisfaction. Other issues that can be addressed are child well-being, adapting to children’s needs, supporting transitions and outdoor activities (EC/EACEA/Eurydice, 2019[4]).
In a number of European countries, formal parent bodies exist in individual centres for the whole age range in ECEC (including provision for children under and over 3), and parent representatives on the formal body have the right to participate in evaluation processes. How parents participate varies between countries; they can contribute to developing internal evaluation processes or discuss and approve evaluation reports. External evaluators can be required to check whether parents have had the opportunity to contribute to the internal evaluation of settings (EC/EACEA/Eurydice, 2019[4]). Luxembourg currently has plans to introduce a mandatory parent council for non-formal education at the national level. Another means to collect parent input would be to collect feedback from the parent council in response to the summary report of monitoring results.
Similarly, the voices of children participating in ECEC are a valuable source of information. Their perceptions of their own well-being and learning should be included in monitoring processes and inform decisions on ECEC. The importance of considering the views of children in monitoring the quality of ECEC has been established, but more needs to be done to develop methods that are appropriate and valid, especially with very young children (OECD, 2021[3]). In Europe, guidelines on the involvement of children in internal or external evaluation of ECEC are relatively common for children above the age of 3 or 4 but rare for children below 3. On an internal basis, regulations can require children to be involved in planning and assessing activities in ECEC on a regular basis (e.g. Norway), or settings can be required to use tools to gather children’s views (e.g. Spain/Comunidad Valenciana). Guidelines can also require external inspectors to speak with children of all ages about their views on the service and consider internal documentation of how children have been consulted (e.g. Scotland) (EC/EACEA/Eurydice, 2019[4]).
Regional officers can be further prepared and supported in their roles
To apply monitoring practices and tools with the depth of understanding needed and ensure that monitoring practices result in consistent and objective judgement, inspectors need to be trained, supported, and monitored (OECD, 2015[1]; Waterman et al., 2012[19]). Research findings suggest that training in implementing monitoring practices supports better monitoring practices, less bias in judgements and better capacity to use assessment for learning and development (OECD, 2015[1]).
In many countries, inspectors must participate in on-the-job or in-service training. Mandatory in-service training can be specified as a certain number of days per year (e.g. one to five days in Chile and Portugal, or five to ten days in the Flemish Community of Belgium) (OECD, 2015[1]). In addition, pre-service training can be required, lasting several weeks (e.g. in some Länder in Germany) or several years (e.g. England). Requirements for significant experience in early childhood development are rare (e.g. England), as are requirements for external inspectors to have completed training in preschool education (OECD, 2021[3]; 2015[1]).
In Luxembourg, 32 regional officers have been recruited since 2017 and trained to perform monitoring visits, aligned with the quality framework, in all non-formal settings. All regional officers must hold a master’s degree in pedagogy or equivalent but do not necessarily need experience of, or training for, work with children in education or ECEC specifically. Regional officers receive two months of initial training when they begin their roles.
Regional officers are monitored by two centrally based co-ordinators. Co-ordinators have received specific training and are regularly supervised. Regional officers and co-ordinators are in close communication, and working groups and further training opportunities are organised when the need arises. In addition, guidelines exist for regional officers on how to carry out their monitoring visits and how to evaluate a setting’s internal documentation (pedagogical concept, logbook).
Thus, Luxembourg has put several processes in place to ensure good quality evaluations. Nevertheless, the monitoring system is very young. During the interviews, stakeholders commented that there are challenges related to a shared understanding and vision of quality and how to arrive at sound conclusions with reference to the pedagogical and educational approaches of settings. Current efforts aim to build more consensus on several pedagogical issues, including participation, children’s rights and the roles of pedagogues. Selection and/or training requirements for regional officers could be expanded to include direct experience in ECEC provision. Further training for regional officers will need to focus on several issues, including a shared view of what good quality is and how it can be assessed. Here, tools for assessing process quality (for example, videos showing differing levels of quality), closely linked to the curriculum framework, will be helpful. In addition, and considering that each regional officer is responsible for many settings, the administrative burden on regional officers needs to be balanced against their capacity to conduct visits.
The consequences and use of monitoring results
Context
Reports to the OECD review team mentioned that settings struggle to meet regulatory requirements in some areas and that it can be difficult for regional officers to address quality concerns. This suggests that the weaknesses identified through monitoring are not addressed with sufficient force to drive improvement across the entire sector, with some settings, often in the non-contracted sector, struggling to deliver the sophisticated child-oriented pedagogy and the multilingual curriculum.
In Luxembourg, the main aim of monitoring carried out by regional officers is to enhance the performance of ECEC provision. The results of monitoring visits are used to provide feedback, first to settings and second to SNJ and MENJE, on what works and to identify areas for improvement. First, at the end of their visit to settings, regional officers have a conversation with the leader of the setting. Their conversational style (open dialogue) and monitoring methods aim to encourage exchanges throughout their visits, supporting their developmental and support roles. Following their monitoring visits, and in addition to direct verbal feedback to leaders of settings, regional officers produce a report on the results of their visit, which is shared with the person responsible at the ECEC centre. Institutions are given the opportunity to comment on their report, and reports (including comments from the institution) are then shared with MENJE and the responsible person at SNJ.
Summary reports are produced, and if issues with the quality of provision are identified across several settings, shared discussions between relevant divisions under MENJE will take place to plan actions. Efforts are made to ensure that summary reports can more easily build on reports at the centre level. Recent issues that have been identified concerned curriculum implementation or staff understanding of multilingual education. Based on shared discussion and with the involvement of other stakeholders, monitoring results have in this way contributed to the recent revision of the curriculum framework, have led to changes in professional development programmes, and have influenced the planning of monitoring procedures.
Additional mechanisms could enhance the effectiveness of the monitoring system
Regional officers oversee the follow-up process with settings. The role of regional officers in providing support and driving improvement processes means that their monitoring results are (for the most part) not directly linked to consequences for settings. Further lowering the stakes, individual monitoring reports are not made available to parents or children, and monitoring results are published in a summary report only. Nevertheless, if there are issues with a setting’s compliance with legal regulations, regional officers are obliged to inform the Department for Children, which can (if necessary) withdraw the setting from the ECEC subsidy funding scheme and thus its financial resources. In cases of continued breaches of regulations, the setting can lose its licence and be removed from the register. However, as in other countries, and connected to demand and supply issues, non-compliance rarely leads to such high-stakes consequences.
The range of options MENJE can take to ensure providers take action to address concerns may need to be revised. First, additional mechanisms to encourage settings to co-operate with regional officers could be considered to enhance the effectiveness of the monitoring system. Some countries, with Singapore as a good example, have introduced financial incentives for settings to design, implement and monitor self‑improvement plans (Box 3.2).
Box 3.2. Examples of incentivising quality improvement in ECEC settings: Singapore
Singapore supports quality improvement through its Preschool Accreditation Framework (SPARK), a monitoring system that places greater emphasis on improvement than on closure or punishment. It employs fiscal incentivisation to encourage participation in a quality improvement cycle, as Singapore has realised that meaningful and dramatic quality enhancement cannot be achieved without injecting substantial resources beyond conventional operating funds.
In Singapore, all ECEC centres must be inspected by the national agency every three years in order to hold operating licences. ECEC centres can also choose to participate in the SPARK process, which entails additional monitoring. Centres use the SPARK framework as a tool for self-assessment and planning, a precursor to applying for external quality accreditation, currently available for centres catering to 4-6 year-olds. A SPARK quality scale for younger children is under development.
Singapore has invested substantial amounts of public funds in developing, trialling, implementing and evaluating its SPARK system. After extensive consultation with stakeholders, SPARK has developed its quality rating scale that includes observation and interviews with a range of staff as well as users. The aspects of quality, especially process quality, identified in their rating scale have been validated through international research, including the link between observed quality and children’s developmental outcomes.
A significant component of SPARK is the development of a detailed improvement plan that includes leadership, planning, staff management, resources, curriculum and pedagogy. Like those in Korea, the quality inspectors in Singapore receive professional development related to observation, work to a visit template, and in the main have substantial experience of working in ECEC themselves. Their monitoring reports are constructive and thorough, containing clear steps towards improvement as well as a timetable for change. After a successful monitoring visit by national inspectors, the centre receives a “quality certificate”, which, in turn, allows them to receive additional government subsidies.
Korea has a similar approach to incentivising centres to create, implement, and monitor self‑improvement strategies, the consequences of which are monitored by external assessors. Thus, there is a close link between public accountability and public money in Singapore, as in its Asian neighbour, Korea. Each centre’s SPARK report is accessible to parents and the public via the website and is used by parents in their choice of provider, much as government inspection reports are made public in Korea, Hong Kong (China) and England. Singapore considers the public sharing of monitoring data a vital element in their national policy of self-improvement and transparency, with more than 40% of preschools SPARK-certified.
Source: Bull, R. and A. Bautista (2018[20]), “A careful balancing act: Evolving and harmonising a hybrid system of ECEC in Singapore”, in Kagan, S. (ed.), The Early Advantage Vol. 1: Early Childhood Systems That Lead by Example, Teachers College Press, New York.
ECEC settings in Luxembourg are required by law to make their pedagogical concepts public. This is seen as a mechanism that enables parents to compare providers in terms of the pedagogical quality of a setting (Achten and Bodeving, 2017[8]). However, not all ECEC settings make their pedagogical concept public, and there are currently no requirements for ECEC settings to make their monitoring reports publicly available. Luxembourg could introduce a requirement to publish (for example, on a parent portal) condensed information on monitoring results (e.g. improvement plans and progress towards achieving goals, and whether parents or regional officers have raised issues of concern). From the side of regional officers or inspectors from the Department for Children, issues of concern could be non-compliance with regulations or settings not following up on recommendations. Such measures would help raise the stakes of monitoring and increase transparency, especially for parents. Ultimately, summary reports should be made fully public to inform parents, but interim steps could be taken while the tools and formats for reporting are being refined.
The OECD (2021[3]) has recently highlighted the importance of considering the resources monitoring systems need to enhance the efficiency of monitoring processes, for example, by tailoring inspections to the needs of providers. For example, the risk of low quality occurring in the future could be assessed during a monitoring visit, and upcoming visits for each setting could be adapted depending on the likelihood of continuing risks to quality. Such risk-based inspection aims to ensure that the approach is proportionate and focuses efforts where they can have the greatest impact. One possibility to better channel resources is to introduce flexibility in the frequency of monitoring visits based on risk-based analysis. Limiting the number of visits each regional officer has to undertake would enable officers to increase the depth of their work, for example, by including systematic observations in their visits, increasing stakeholder involvement, and strengthening planning and action towards improvement. This approach could be particularly effective if combined with thorough self-evaluation processes and a review of self-evaluation results during each monitoring visit.
Second, additional mechanisms to support ECEC settings in Luxembourg to implement changes that have been recommended could be considered to enhance the effectiveness of the monitoring system. The “additional” mechanisms might include extra funding, as is done in Singapore (Box 3.2).
Importantly, monitoring can further contribute to quality improvement if monitoring results are shared with those institutions in charge of initial education of the ECEC workforce, as training can then be adapted accordingly to meet the professional and practical needs of future staff (OECD, 2015[1]). Strengthening communication channels between providers of continuous professional development training for ECEC staff and those dealing with monitoring results at MENJE could further improve the impact of monitoring results on quality improvement.
There are currently no systematic procedures in place for centre leaders to share monitoring results with staff members. Staff, however, are the main actors in delivering good quality ECEC. For monitoring to be effective in improving practice, assessment of practice needs to link up with the objectives for reflection and improvement, and ECEC staff need to be provided with feedback and support on how to use monitoring results for their development (OECD, 2021[3]). Thus, their involvement needs to be ensured not only during monitoring visits but also in designing improvement plans in response to the results of monitoring. Time during working hours also needs to be protected for any new required tasks.
The capacity of regional officers to foster quality improvement can be strengthened
Regional officers are not responsible for coaching managers and ECEC staff, but they are responsible for supporting providers in the development of improvement plans. During interviews, challenges were reported concerning the role of regional officers in supporting quality improvement; to some extent, these relate to the ongoing challenge of developing a common understanding of important pedagogical issues. Hand in hand with the development of a common view on quality and the practices that support it, steps will need to be taken to support regional officers in making recommendations and drawing up improvement plans. The role of co-ordinators could, for example, be further developed to ensure regional officers take a consistent approach to quality improvement as well as quality assessment.
In addition, the existing overlap between regional officers’ support and control functions may hinder the development of relationships of trust between regional officers and providers, trust that enables effective collaboration in the development of improvement plans. Importantly, recent developments have clarified the roles of officers and agents from the two monitoring bodies (SNJ and the Department for Children).
Regional officers need to be trained, well guided and supported on how to provide feedback to settings that is linked to their judgements and conclusions in monitoring reports. This requires transparency to staff in settings on how recommendations are linked to monitoring results. It also requires that regional officers are familiar with the quality support system and know how to match learning needs with the types of support available to staff in settings. Follow-up of improvement plans needs to be ensured, so that strategies are implemented in a timely manner, and evidence of effectiveness is collected and reviewed by regional officers. In response to these training needs, there are plans to make “job shadowing” during monitoring visits to centre-based and home-based providers part of the training for regional officers.
In addition, adopting a risk-based approach to inspection and limiting the number of visits by each regional officer would free up time to focus on developing quality improvement measures in consultation with ECEC settings. As a reform of professional development is ongoing, it will be important to better involve regional officers in professional development plans developed by settings (see Chapter 2). This reform provides opportunities to develop coaching and centre-embedded training that can be encouraged by regional officers to meet the needs for quality improvement of some settings.
Policies to improve performance in the ECEC sector can be more systematically informed by data on ECEC providers, children and the workforce
Planning policies that ensure the accessibility and quality of ECEC services requires analysis of performance in the sector that is based on and brings together information on its providers, workforce, and children and families. Linking these quantitative data with the outcomes of monitoring processes can provide valuable evidence on the strengths and weaknesses of the ECEC system, which helps to identify gaps that need to be addressed, and thus plays a key role in guiding policy development.
In the context of Luxembourg, it will be particularly important to link findings on ECEC process and structural quality with analysis that assesses the diversity of participating families in relation to access and patterns of ECEC participation, and the diversity of the workforce, in relation to resources and working conditions in different types of provision. Research has shown that quality varies between different types of providers (e.g. Mathers, Sylva and Joshi (2007[21]); Slot et al. (2015[22])). The OECD review team has heard that this is also true in the context of Luxembourg, with the non-contracted sector and smaller centres, in particular, struggling to provide good quality ECEC and to comply with some regulations – for example, those related to staff language skills, or hours of professional development (see Chapter 2).
To understand whether public funding needs to be increased in the commercial (non-contracted) sector, or whether tighter mechanisms need to be developed to ensure that resources are used efficiently to provide good quality, data on ECEC providers’ costs need to be analysed in relation to the demand-side subsidies they receive (through the ECEC subsidy funding scheme) and their additional income from parent fees. Such analysis could help clarify whether certain types of providers need additional supply-side funding, for example, those serving a higher percentage of children at risk (e.g. from low-income families, or with additional educational needs or disabilities).
Bringing together data on each provider will also help monitoring bodies target their resources more effectively. A digital platform is currently being developed in Luxembourg with the aim of centralising data collection, which should better enable the sharing and effective use of information. The aim is to integrate, for example, data collected through licensing processes and structural quality monitoring, as well as results from monitoring focused on process quality aspects. The integration of further quantitative data on enrolment, staffing and users of ECEC in this single database should also be considered. The design of a portal that integrates the infrastructure to receive public funding (e.g. subsidies through the ECEC subsidy funding scheme or funds for professional development) with the collection of information required for monitoring will reduce the administrative burden for providers while at the same time ensuring data is provided and can be linked to enable analysis that informs policies.
Steps will need to be taken to plan who contributes to the collection of this data, which resources are needed at the level of the centre or provider to ensure information is kept up to date, how arrangements of data sharing between relevant divisions can be set up, and who oversees analysis of this information to assess the performance of the sector and guide policy development.
An in-depth review of early childhood services in six high-performing jurisdictions (Australia, Hong Kong [China], Finland, Korea, Singapore, England [United Kingdom]) (Kagan et al., 2019[23]) identified several building blocks of effective systems, among which is “data to drive improvement”. This building block focused on strategies within each jurisdiction to “advance knowledge, evaluate program effectiveness, test innovations, fuel strategic planning, and inform policy reforms”. Deliberate and intentional processes in each of the successful jurisdictions to collect, synthesise and then use data on both children and services drive improvements across the system. Luxembourg’s use of this building block, “data to drive improvement”, needs review and strengthening to match the ambitious curricular aims of the country and its substantial financial investment.
References
[8] Achten, M. and C. Bodeving (2017), “Development of quality in the non-formal education sector in Luxembourg”, in Klinkhammer, N. et al. (eds.), Monitoring Quality in Early Childhood Education and Care: Approaches and Experience from Selected Countries, German Youth Institute, Department of Children and Childcare, Munich.
[20] Bull, R. and A. Bautista (2018), “A careful balancing act: Evolving and harmonising a hybrid system of ECEC in Singapore”, in Kagan, S. (ed.), The Early Advantage Vol. 1: Early Childhood Systems That Lead by Example, Teachers College Press, New York.
[14] de Haan, A. et al. (2013), “Targeted versus mixed preschools and kindergartens: Effects of class composition and teacher-managed activities on disadvantaged children’s emergent academic skills”, School Effectiveness and School Improvement, Vol. 24/2, pp. 177-194, https://doi.org/10.1080/09243453.2012.749792.
[15] Early, D. et al. (2010), “How do pre-kindergartners spend their time? Gender, ethnicity, and income as predictors of experiences in pre-kindergarten classrooms”, Early Childhood Research Quarterly, Vol. 25, pp. 177-193, https://doi.org/10.1016/j.ecresq.2009.10.003.
[4] EC/EACEA/Eurydice (2019), Eurydice Brief: Key Data on Early Childhood Education and Care in Europe, Publications Office of the European Union, Luxembourg, https://eacea.ec.europa.eu/national-policies/eurydice/content/eurydice-brief-key-data-early-childhood-education-and-care-europe_en (accessed on 6 October 2021).
[10] EC/EACEA/Eurydice (2019), Eurydice Report: Key Data on Early Childhood Education and Care in Europe – 2019 Edition, Publications Office of the European Union, Luxembourg, https://eacea.ec.europa.eu/national-policies/eurydice/content/key-data-early-childhood-education-and-care-europe-–-2019-edition_en.
[13] Eurofound (2012), Quality of Life in Europe: Impacts of the Crisis, Publications Office of the European Union, Luxembourg, https://www.eurofound.europa.eu/publications/report/2012/quality-of-life-social-policies/quality-of-life-in-europe-impacts-of-the-crisis.
[23] Kagan, S. et al. (2019), “Data to drive improvement”, in Kagan, S. (ed.), The Early Advantage: Building Systems That Work for Young Children, Volume 2, Teachers College Press, New York.
[16] Kuger, S. and K. Kluczniok (2008), “Prozessqualität im Kindergarten: Konzept, Umsetzung und Befunde”, Zeitschrift für Erziehungswissenschaft, Sonderheft 11, pp. 159-178, https://doi.org/10.1007/978-3-531-91452-7_11.
[9] Luxembourg Ministry of Education, Children and Youth (2020), The Luxembourg Education System, https://men.public.lu/dam-assets/catalogue-publications/divers/informations-generales/the-luxembourg-education-system-en.pdf.
[21] Mathers, S., K. Sylva and H. Joshi (2007), Quality of Childcare Settings in the Millennium Cohort Study, Research Report SSU/2007/FR/025, Department for Education and Skills, Nottingham, https://dera.ioe.ac.uk/8088/.
[18] Meisels, S. and S. Atkins-Burnett (2000), “The elements of early childhood assessment”, in Shonkoff, J. and S. Meisels (eds.), Handbook of Early Childhood Intervention, Cambridge University Press, https://doi.org/10.1017/CBO9780511529320.013.
[6] Melhuish, E. et al. (2015), “A review of research on the effects of early childhood education and care (ECEC) upon child development”, CARE project, https://ecec-care.org/fileadmin/careproject/Publications/reports/CARE_WP4_D4__1_review_of_effects_of_ecec.pdf.
[3] OECD (2021), Starting Strong VI: Supporting Meaningful Interactions in Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://dx.doi.org/10.1787/f47a06ae-en.
[12] OECD (2020), Quality Early Childhood Education and Care for Children Under Age 3: Results from the Starting Strong Survey 2018, TALIS, OECD Publishing, Paris, https://doi.org/10.1787/99f8bc95-en.
[2] OECD (2018), Engaging Young Children: Lessons from Research about Quality in Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264085145-en.
[1] OECD (2015), Starting Strong IV: Monitoring Quality in Early Childhood Education and Care, Starting Strong, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264233515-en.
[5] Pascal, C. et al. (2013), A Comparison of International Childcare Systems: Research Report, Centre for Research in Early Childhood (CREC), Birmingham, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/212564/DFE-RR269.pdf.
[7] Shuey, E. and M. Kankaras (2018), “The Power and Promise of Early Learning”, OECD Education Working Papers, No. 186, OECD Publishing, Paris, http://dx.doi.org/10.1787/f9b2e53f-en.
[17] Siraj, I., D. Kingston and E. Melhuish (2015), Assessing Quality in Early Childhood Education and Care: Sustained Shared Thinking and Emotional Well-being (SSTEW) Scale for 2–5-year-olds Provision, Trentham Books, London.
[22] Slot, P., M. Lerkkanen and P. Leseman (2015), “The relations between structural quality and process quality in European early childhood education and care provisions: Secondary analyses of large scale studies in five countries”, CARE Project.
[11] SNJ (2021), Quality Beyond Regulations in Early Childhood Education and Care (ECEC): Country Background Report of Luxembourg, Service national de la jeunesse.
[19] Waterman, C. et al. (2012), “The matter of assessor variance in early childhood education: Or whose score is it anyway?”, Early Childhood Research Quarterly, Vol. 27/1, pp. 46-54, https://doi.org/10.1016/j.ecresq.2011.06.003.