6. Monitoring and evaluation to address educational disadvantage

Abstract

This chapter examines how outcomes are monitored and evaluated at the system and school levels in the Irish education system, with a particular focus on the DEIS programme. Ireland has developed strong expertise in monitoring and evaluation in regard to the DEIS programme. Moreover, the system emphasises the role of self‑evaluation, and the Inspectorate serves a vital role in school evaluation more generally. However, challenges remain concerning the limited use of granular and combined administrative data, the absence of control groups in DEIS evaluations (and hence of causal evidence), and insufficient capacity for data‑informed improvement planning in DEIS schools. The chapter provides recommendations to overcome these challenges and strengthen the monitoring and evaluation of the DEIS programme and the education system more broadly.

Contextual background
Monitoring and evaluation of outcomes at the system level
Monitoring and evaluation are essential to assess progress in improving education outcomes. Monitoring refers to the systematic collection of data to assess the progress and achievement of policy objectives against set targets, and to identify and lift implementation bottlenecks (OECD, 2024[1]). Evaluation refers to judgements on the effectiveness of schools, school systems, policies and programmes (OECD, 2013[2]). Monitoring and evaluation are crucial in providing feedback to inform improvements across the education system and identifying necessary school support measures (OECD, 2023[3]). Without relevant monitoring data, policy makers might evaluate policies and practices according to the imperfect information available to them. This might misdirect them or, where data are absent altogether, leave them unaware of challenges that need action (ibid.).
Monitoring and evaluation efforts are often summarised in strategic documents. Ireland's main strategic document that monitors education inputs, processes and outcomes is the Statement of Strategy, which runs from 2023 to 2025 (Department of Education, 2023[4]). It outlines the vision, mission, values, goals and actions of the Department of Education (DoE) for the three years (Chapter 2). It reflects the challenges and opportunities in the education sector, such as the impact of the COVID-19 pandemic, the arrival of Ukrainian students, the need for digital and climate action, the enrolment projections and trends, a growing recognition of the importance of the personal well‑being of children and young people, and international and cross‑governmental commitments and obligations. In terms of monitoring and evaluation, the Statement sets out four strategic goals and several strategic actions for each goal. The goals are to:
Enable the provision of high-quality education and improve the learning experience to meet the needs of all children and young people, in schools and early learning and care settings;
Ensure equity of opportunity in education and that all children and young people are supported to fulfil their potential;
Provide strategic leadership and support for the delivery of the right systems and infrastructure for the sector; and
Achieve organisational excellence and innovation.
The actions cover various aspects of the education system, such as curriculum and assessment, teacher supply and professional learning, special education and educational disadvantage, school infrastructure and transport, digital and climate action, quality assurance and evaluation, Irish language and Gaeltacht education, and stakeholder engagement and communication. Statements of Strategy are monitored through the DoE Annual Reports (Department of Education, 2022[5]). The Annual Reports summarise actions taken for each goal outlined in the Statement.
Monitoring of the Delivering Equality of Opportunity In Schools (DEIS) Plan is achieved primarily through targets set in the “National Strategy: Literacy and Numeracy for Learning and Life 2011-2020” (Department of Education and Skills, 2017[6]). The strategy was developed around six pillars:
Enabling parents and communities to support children’s literacy and numeracy development;
Improving teachers’ and early childhood care and education practitioners’ professional practice;
Building the capacity of school leadership;
Improving the curriculum and the learning experience;
Helping students with additional learning needs to achieve their potential; and
Improving assessment and evaluation to support better learning in literacy and numeracy.
The strategy outlines targets for the education system (Department of Education and Skills, 2017[6]). It also sets out implementation plans with particular actions around each pillar and indicative dates for adopting the actions to achieve the targets (ibid.). At the primary level, the targets are based on achievement in the National Assessments of Mathematics and English Reading (NAMER) 2014. At the post‑primary level, the strategy sets Programme for International Student Assessment (PISA) targets, and targets for the number of students taking higher level mathematics in Junior Cycle and Leaving Certificate Examinations.
Table 6.1 summarises some of these targets and provides an overview of results in NAMER 2021 and PISA 2022. Out of the four targets evaluated for DEIS Urban Band 1 primary schools in NAMER 2021, one was achieved while the remaining three were not met. These targets were established before the COVID‑19 pandemic and, therefore, do not account for the disruption caused. Additionally, due to pandemic-related adjustments in the assessment process, data for four other DEIS Urban Band 1 school targets could not be collected. Despite these disruptions, the achievement gap between DEIS and non‑DEIS schools did not widen. The DoE is finalising a new Literacy, Numeracy and Digital Literacy Strategy. This will include updated indicators for literacy and numeracy development in DEIS schools.
Table 6.1. Targets in National Strategy: Literacy and Numeracy for Learning and Life
Primary schools

| Level | Class | Target for 2020: all primary schools | Value in NAMER 2021 | Target for 2020: DEIS Urban Band 1 schools | Value in NAMER 2021 |
|---|---|---|---|---|---|
| Reading: at or above Level 3 | Second class | 50% | 44.1% | 25% | 25.0% |
| Reading: at or above Level 3 | Sixth class | 50% | N/A | 27% | N/A |
| Reading: at or below Level 1 | Second class | 20% | 24.4% | 40% | 43.2% |
| Reading: at or below Level 1 | Sixth class | 20% | N/A | 40% | N/A |
| Mathematics: at or above Level 3 | Second class | 53% | N/A | 30% | N/A |
| Mathematics: at or above Level 3 | Sixth class | 50% | 41.4% | 27% | 22.4% |
| Mathematics: at or below Level 1 | Second class | 20% | N/A | 45% | N/A |
| Mathematics: at or below Level 1 | Sixth class | 20% | 27.3% | 42% | 48.6% |

Post-primary schools

| Level | Target for 2020: all post-primary schools | Value in PISA 2022 | Target for 2020: DEIS Post-primary schools | Value in PISA 2022 |
|---|---|---|---|---|
| Reading: at or above Level 4 | 40% | 35.4% | 33% | N/A |
| Reading: at or above Level 5 | 12% | 10.2% | 10% | N/A |
| Reading: at or below Level 1 | 8.5% | 11.3% | 12% | N/A |
| Mathematics: at or above Level 4 | 36% | 26.0% | 29% | N/A |
| Mathematics: at or above Level 5 | 13% | 7.2% | 10% | N/A |
| Mathematics: at or below Level 1 | 10.5% | 18.9% | 16% | N/A |
Note: The table only mentions targets specific to DEIS schools. NAMER 2021 assessed English reading achievement in the second class and mathematics achievement in the sixth class. Due to COVID-19 pandemic-related adjustments in the assessment process, data for four other DEIS Urban Band 1 school targets could not be collected. Values from PISA 2022 for DEIS post-primary schools have not yet been published.
Source: Department of Education and Skills (2017[6]), National Strategy: Literacy and Numeracy for Learning and Life 2011-2020, https://assets.gov.ie/24960/93c455d4440246cf8a701b9e0b0a2d65.pdf (accessed on 27 November 2023); Donohue et al. (2023[7]), Education in a Dynamic World: the performance of students in Ireland in PISA 2022, https://www.erc.ie/wp-content/uploads/2023/12/B23617-Education-in-a-Dynamic-World-Report-online-1.pdf (accessed on 17 June 2024); and Nelis and Gilleece (2023[8]), Ireland’s National Assessments of Mathematics and English Reading 2021: A focus on achievement in urban DEIS schools, https://www.erc.ie/wp-content/uploads/2023/05/B23572-NAMER-DEIS-report-Online.pdf (accessed on 27 May 2024).
In addition, the DoE publishes and co-operates on several other strategic documents, such as the “National Strategy for Higher Education to 2030” (Department of Education and Skills, 2011[9]), “Digital Strategy for Schools to 2027” (Department of Education, 2022[10]), “Traveller and Roma Education Strategy” (Department of Education, 2023[11]), and “Housing for All Youth Homelessness Strategy 2023‑2025” (Department of Housing, Local Government and Heritage, 2022[12]), among others. Each of these also has specific targets or identifies data collections that can be used to monitor the outcomes.
The DoE has a range of datasets to monitor student academic and well-being outcomes at the system level, although not all have been fully utilised for this purpose (section Challenges). It maintains electronic databases, namely the Primary Online Database (POD) and the Post-primary Online Database (PPOD), which serve as repositories for a wide array of information about primary and post-primary school students (Table 6.2). These databases capture essential demographic details such as name, address, Personal Public Service Number (PPSN), gender, date of birth and nationality. Furthermore, the DoE collects additional data voluntarily and with explicit written consent from parents/guardians or students over 18. The POD includes information on the child's religion, while the PPOD records data on the student's mother tongue. Both databases also capture data on ethnic and cultural backgrounds.1 The DoE also gathers a range of data on various aspects of the DEIS programme, including support and resources provided to DEIS schools and retention rates of students in schools. Primary schools must also report aggregate standardised test results from second-, fourth- and sixth-class levels (NCCA, 2017[13]). Junior Cycle and Leaving Certificate subject assessment results can be obtained from the State Examinations Commission at the individual student level (Department of Education, 2024[14]). These data also contained an indicator of fee waiver, which was used as a proxy for socio‑economic background (Weir and Kavanagh, 2018[15]). However, since 2020, examination fees have been waived for all students, making this indicator no longer valid as a proxy for socio‑economic background (Department of Education, 2022[16]). The PPOD database is used to publish regular reports on the retention of students at the post‑primary level (Department of Education, 2023[17]).
Some of these data are analysed on behalf of the DoE by the Educational Research Centre (ERC), which conducts research, assessment and evaluation across all levels of the education system in Ireland (Department of Education, 2024[14]). Indeed, centrally-held administrative data have been examined to consider changes in input and output variables, such as retention rates in post‑primary schools and class sizes in primary schools (Kelleher and Weir, 2017[18]; Weir and Kavanagh, 2018[15]). The centre also collaborates with various agencies and initiates independent research projects. This includes involvement in international large‑scale assessments and NAMER, further enriching the depth and breadth of education‑related data available for analysis and policy formulation. These studies assess reading, mathematics and science at national and international levels, and collect contextual information from students, parents, teachers and principals (Table 6.2).
Table 6.2. Overview of key datasets available to the Department of Education
| Name | Overview of information provided |
|---|---|
| Administrative data | |
| Primary Online Database | Name, address, Personal Public Service Number (PPSN), gender, date of birth and nationality, religion*, ethnic and cultural background*. |
| Post-primary Online Database | Name, address, PPSN, gender, date of birth and nationality, mother tongue*, ethnic and cultural background*. |
| Standardised assessment results | Aggregate primary school-level data from second, fourth and sixth classes. Individual-level subject grades at the post‑primary level. These data also contain an indicator of fee waiver, which was used as a proxy for socio‑economic background**. |
| School attendance | All state‑aided schools report aggregate school‑level data on total days lost by all students, total students absent for 20 or more days, total students suspended, and total students expelled. Data on students aged over six and under 16 who were absent for a cumulative total of 20 or more days are collected at the student level, by reason for absence. The data are gathered through the Tusla Portal hosted by Tusla - Child and Family Agency. |
| National large-scale assessments/surveys | |
| National Assessments of Mathematics and English Reading (NAMER) | Primary school students in selected second and sixth classes take mathematics and English reading assessments. Students, parents, teachers and principals also complete questionnaires to gather contextual information. |
| Evaluation of the Digital Learning Framework (DLF) | The ERC evaluation of the implementation of the DLF collected baseline information from schools and then followed representative samples of primary, post-primary and special schools, with surveys of school staff and focus groups with school staff and students, over multiple waves from 2019 to 2021. |
| International large-scale assessments | |
| Progress in International Reading Literacy Study (PIRLS) | International assessment of reading of fourth-grade students. The study also collects contextual information from students, parents, teachers and principals. |
| Programme for International Student Assessment (PISA) | International study of education examining reading, mathematics and science of 15‑year‑olds, along with contextual information from students, parents and principals. |
| Trends in International Mathematics and Science Study (TIMSS) | International assessment of mathematics and science of fourth- and eighth-class students. The study also collects contextual information from students, parents, teachers and principals. |
Note: * Collected voluntarily and with explicit written consent from parents/guardians or students over 18. ** Examination fees have been waived for all students since 2020, making this indicator no longer valid as a socio-economic background proxy (Department of Education, 2022[16]). The table aims to provide an overview, not an exhaustive list.
Several studies have also measured the well-being outcomes of students. For instance, the Growing Up in Ireland survey was used to monitor the outcomes of students from various backgrounds over time. Researchers examined the risk and protective factors for the mental health and well-being of children and young people at the age of 9 and then 13 (Nolan and Smyth, 2021[19]; Smyth et al., 2023[20]). The research examined both positive (life satisfaction) and negative (socio-emotional difficulties) aspects of mental health and well‑being (ibid.). The Children’s School Lives Study follows two age cohorts – children who started the second class in 2018 and children who transitioned from preschool into Junior Infants (see Chapter 1) in 2019 (NCCA, UCD Dublin, n.d.[21]). Commissioned by the National Council for Curriculum and Assessment (NCCA) and conducted by the University College Dublin School of Education, it aims to understand children’s learning, well-being, engagement, and experiences of equality, diversity and inclusion (ibid.).
Furthermore, in carrying out NAMER, PIRLS, PISA and TIMSS, the ERC includes surveys of families, students and school staff to provide a contextual background and broader picture of the well‑being of students from different socio-economic backgrounds. For instance, a study based on PISA 2018 results examined students' characteristics, home environments and parents' involvement in education. The researchers also considered school factors related to diversity of intake, resources, practices and school climate. They presented findings on non-cognitive outcomes and dispositions (well-being, attitudes and aspirations) (Nelis et al., 2021[22]). In NAMER 2021, researchers considered second- and sixth-class primary students’ achievement in relation to their characteristics, the characteristics of their schools and teachers, and access to and use of school resources (Gilleece and Nelis, 2023[23]).
Monitoring and evaluation at the school level
Ireland has a comprehensive school evaluation infrastructure. According to PISA 2022, 92.7% of students attended schools where principals reported that external evaluation exists as an arrangement aimed at quality assurance and improvement (either mandatory or on the school’s initiative), compared to 77.6% on average across OECD countries (Figure 6.1). Furthermore, all principals in Ireland reported that self‑evaluation processes occur at their schools, compared to the OECD average of 95.3%.
In Ireland, the Inspectorate is responsible for external evaluation, although, as elaborated later, part of the external school review process is strengthening and promoting self-evaluation mechanisms. During the 2022 school year, there were 1 820 inspection and advisory activities in the 3 095 primary schools and 851 inspection and advisory activities in the 728 post‑primary schools; 608 of the primary-level inspection activities took place in DEIS primary schools and 277 of the post-primary inspection activities in DEIS Post‑primary schools. Inspections are planned at the central and regional levels according to various criteria, including selecting schools on a risk basis (OECD, 2020[26]). Areas of enquiry include the quality of teaching and learning, the quality of leadership and management, and the quality of support for well‑being (Department of Education, 2022[27]; Department of Education, 2022[28]). Outcomes are publicly available on the website of the DoE and shared with education authorities (Department of Education, 2023[29]). The Inspectorate also conducts education‑focused inspections in publicly‑funded early childhood education and care (ECEC) settings (Department of Education, 2024[30]). Tusla - The Child and Family Agency (Tusla), the statutory regulator, inspects for compliance with education and care regulations in all ECEC settings (ibid.).
School self-evaluation (SSE) has received growing attention since 2012, when it became compulsory for all schools in the Irish education system. The SSE evaluation framework sees external and internal evaluation as complementary contributors to school improvement and capacity building. To support this, the Inspectorate published “Looking at Our School 2022: A Quality Framework for Primary and Special Schools” and “Looking at Our School 2022: A Quality Framework for Post-Primary Schools” (Department of Education, 2022[27]; Department of Education, 2022[28]). These frameworks provide a shared understanding of what effective and highly effective learning, teaching, leadership and management practices look like in the Irish school system, and a coherent set of standards that are used to inform both internal SSE and external inspection.
The Inspectorate also published “School Self-Evaluation Next Steps: September 2022 - June 2026” (Department of Education, 2022[31]). This publication is designed to further assist all schools to make SSE as effective as it can be to meet the needs of the children and young people they serve.
In essence, SSE is a collaborative, internal reflection, review and planning mechanism to advance various teaching, learning and well-being aspects. The focus of the SSE process varies across different types of schools in light of their context and the differentiated policy expectations nationally. Schools in the DEIS programme use the DEIS themes as the main focus of SSE. As part of this process, schools are asked to set specific, measurable, achievable, realistic and time‑specific targets, and to evaluate these annually by monitoring the impact of actions undertaken in the key DEIS themes. A school’s DEIS Action Plan for Improvement is its school improvement plan for SSE, and no additional or separate plan is necessary (see section The system emphasises the role of self-evaluation for school improvement for more information) (Department of Education, 2023[32]).
Strengths
Ireland has strong expertise in monitoring and evaluation in regard to DEIS
The DoE is committed to monitoring and evaluation, highlighted by collaboration and close integration with the ERC, and evaluation of various pilot programmes. The system is further enriched by research initiatives undertaken by other external organisations. This is underpinned by heightened awareness of educational inequalities, their causes and consequences among Irish politicians (Reay, 2022[33]). For example, a report by the then Joint Committee on Education and Skills, a parliamentary body shadowing the DoE and other departments, acknowledged that “the current structure, where there is an unequal distribution of income and wealth, is being legitimised through the ideologies of meritocracy, and is acting to reproduce social class related inequalities” (Houses of the Oireachtas, 2019, p. 20[34]). This ecosystem contributes to Ireland's strong expertise in monitoring and evaluating education. As a result, the DoE has a wealth of evaluations at its disposal. These studies are quantitative and qualitative, and focus on primary and post‑primary schools in and out of the DEIS programme. They also look at factors beyond student performance.
A strong expertise in monitoring and evaluation is exemplified by close collaboration with the academic and research sector, most notably the ERC. Established in 1966 and designated as a Statutory Body in 2015, the ERC collaborates closely with the DoE, undertaking research at all educational levels (Department of Education, 2024[14]). The ERC provides data for evidence-based decision-making through its extensive portfolio, including PIRLS, PISA, TIMSS and NAMER assessments. Moreover, the ERC’s collaboration with the DoE on the evaluations of the DEIS programme exemplifies the synergy between academia and policy making. The ERC has also engaged in subject-specific evaluations, examining, for instance, the impact of schemes like the Home School Community Liaison (HSCL) Scheme (Weir et al., 2018[35]). The DoE’s and ERC’s commitment to transparency is manifested through the publication of this diverse range of reports, fostering a culture of informed discourse and accountability. Indeed, the DoE’s (and Inspectorate’s) close partnership with the ERC is a significant strength at both system and school levels. The national and international assessments facilitated by the ERC provide valuable steering data for the system, and the research is also distilled into practice-centred findings that can help change approaches to teaching and learning. Furthermore, the ERC publishes guides for practitioners that help ground the research conclusions in practical terms for schools and teachers.
The DoE’s commitment to innovative educational initiatives is further evidenced by a range of pilot programmes, each targeting specific needs within the educational sector. The Rutland Street Pre-School Project is an early example, piloting methods later adopted in the Early Start project for ECEC (Department of Education, 2021[36]). The Droichead induction programme for newly qualified teachers, developed after a pilot, now plays a crucial role in professional development (Smyth et al., 2016[37]). Monitoring of the pilot involved distributing questionnaires and conducting interviews across participating schools, which provided data to explore the experience and effectiveness of the programme (ibid.). Furthermore, the Substitute Teacher Supply Panel Scheme, initially piloted and later expanded, reflects the DoE’s adaptive approach to addressing practical challenges in schools (Department of Education, 2022[38]). Even the HSCL Scheme (Chapter 5), now an essential programme for fostering partnerships between parents, teachers and the community to improve educational outcomes, started as a pilot (Weir et al., 2018[35]). The Scheme has been reviewed several times since its introduction as a mainstream intervention in 1993 (Archer and Shortt, 2003[39]; Ryan, 1994[40]; Weir et al., 2018[35]). Researchers collected questionnaire data from HSCL Coordinators on, for example, time spent on activities relating to parents, activities relating to teachers and community-related activities (ibid.). Student achievement data were also collected as part of some of the reviews, with achievement gains observed for some students (Weir et al., 2018[35]). Further details on these initiatives are explored in Chapters 4 and 5.
Furthermore, monitoring and evaluation are deeply integrated into the education system, with various institutions and organisations outside the public sector commissioning relevant research. For instance, Educate Together, a charity and patron of a network of over 100 schools, commissioned an evaluation of the Nurture Schools project to build resilience and improve children’s social, emotional and mental health and well‑being (Educate Together, 2023[41]). AsIAm, a charity helping people with autism, conducted research on school absence and withdrawal among children with autism (AsIAm, 2019[42]). The Irish Second-Level Students' Union surveyed students to highlight their views on the recently reformed Leaving Certificate Applied programme (ISSU, 2023[43]). The Teaching Council, the regulator of the teaching profession in Ireland, promotes a culture of shared learning in which research and leading practice are encouraged and applied within the classroom setting (Teaching Council, n.d.[44]). To this end, the Council developed the Collaboration and Research for Ongoing Innovation Research Series to support a culture of shared learning and evidence-informed practice, and the Research Bursary Scheme that offers support to teachers wishing to carry out new research (Teaching Council, n.d.[44]; Teaching Council, n.d.[45]).
Studies and evaluations of the DEIS programme use a wide range of quantitative and qualitative sources
Evaluations of the DEIS programme at the primary level used a wide range of databases to estimate the programme's impact, from administrative data through large-scale national and international assessments to samples collected for specific research purposes. Quantitative approaches were both longitudinal and cross-sectional. Some researchers also considered various contextual factors to better discern the differences between DEIS and non‑DEIS schools. Generally, the studies show that, no matter the subject and class tested, students in the most disadvantaged DEIS Urban Band 1 schools underperform their peers in DEIS Urban Band 2 schools (Cosgrove and Creaven, 2013[46]; Kavanagh and Weir, 2018[47]; Kavanagh, Weir and Moran, 2017[48]; McGinnity, Darmody and Murray, 2015[49]). In regard to primary DEIS Rural schools, evidence points to lower scores compared to non‑DEIS schools, but, depending on the study, the difference can be non‑significant (Cosgrove and Creaven, 2013[46]; Delaney et al., 2023[50]; Gilleece, 2015[51]) or can disappear once a range of factors, including the socio‑economic background of students, is taken into account (Cosgrove and Creaven, 2013[46]; Gilleece, 2015[51]; McCoy, Quail and Smyth, 2014[52]). In contrast, the difference for DEIS Urban Band 1 schools often remains even after taking into account other factors, such as student social background, school resources, teacher factors, school climate and student engagement (Cosgrove and Creaven, 2013[46]; McCoy, Quail and Smyth, 2014[52]; McGinnity, Darmody and Murray, 2015[49]). There is also evidence suggesting that gaps between DEIS and non‑DEIS schools, after taking into account a range of factors including the socio‑economic background of students, are decreasing over time: in NAMER this holds for mathematics but not reading (Karakolidis et al., 2021[53]; Karakolidis et al., 2021[54]), while in TIMSS it holds for both mathematics and science (Duggan et al., 2023[55]). However, these results are not universal (e.g. see Karakolidis et al. (2021[54])). Indeed, the relationship between home resources for learning (a proxy for socio‑economic background) and mathematics and science performance became stronger in the more recent cycles of TIMSS, even taking into account the DEIS status of primary schools (Duggan et al., 2023[55]).
Longitudinal studies were used to discern any improvements or regression in scores for a particular cohort of students. Depending on the study, year and sample size, conclusions broadly maintain that gaps between DEIS and non‑DEIS schools are not widening (Kavanagh and Weir, 2018[47]; Smyth, McCoy and Kingston, 2015[56]). Some studies indicate improved students' literacy and numeracy test scores in DEIS primary schools over time (Kavanagh, Weir and Moran, 2017[48]; Smyth, McCoy and Kingston, 2015[56]). Moreover, the results revealed a decrease in the percentage of students in DEIS Urban schools scoring below the 10th national percentile and a slight increase among the top 10th national percentile, indicating that the decline in low scorers was not achieved at the expense of a reduction in high scorers (a possibility if an exclusive focus was placed on raising the achievement of lower-achieving students) (Weir et al., 2017[57]). However, researchers also observed a significant heterogeneity in school performance over time and some schools experienced a decrease rather than an increase in mathematics test scores (Kavanagh, Weir and Moran, 2017[48]).
Beyond student performance, attendance in primary schools has also seen improvements (Smyth, McCoy and Kingston, 2015[56]), although these results predate the COVID‑19 pandemic. More recent statistics suggest that the gaps between DEIS and non‑DEIS schools in some attendance indicators are widening (Tusla, 2023[58]). For instance, DEIS schools have traditionally experienced higher rates of students absent for 20 or more days. In 2019/20, the rate of 20-plus day absences in DEIS Urban Band 1 schools stood at 12.1%, compared to 5.0% for all primary schools (a gap of 7.1 percentage points). This rate increased to 27.2% in 2020/21, compared to 11.1% for all schools (a 16.1‑point gap). For the 2021/22 school year, the absence rate stood at 57.6%, compared to the national rate of 40.3% (a 17.3‑point gap). However, the data quality is insufficient to support strong conclusions, as school response rates were relatively low and the understanding of the requirement to record absences during the COVID‑19 pandemic varied between schools (ibid.). Thus, an in‑depth analysis which includes data from post‑COVID years is needed to draw robust conclusions. Tusla Education Support Service (TESS), in partnership with the DoE, has also launched the National School Attendance Campaign 2023, and schools were provided with a once-off payment to promote and support regular school attendance through the Attendance Campaign Support Grant for Primary and Post-Primary Schools (Department of Education, 2023[59]; Department of Education, 2023[60]).
At the post-primary level, analyses reveal a nuanced picture of the impact of the DEIS programme. National PISA analyses show that students in DEIS schools underperformed those in non‑DEIS schools (Donohue et al., 2023[7]; Gilleece et al., 2020[61]). Some researchers also suggested that the size of the gap has narrowed in reading until 2018 (it has not changed significantly in mathematics or science) (Gilleece et al., 2020[61]). Using administrative data, researchers concluded that the gap in average Junior Certificate Overall Performance Score was narrowing between 2002 and 2011 (Weir et al., 2014[62]). Moreover, the Overall Performance Score of DEIS schools grew faster following the introduction of the DEIS programme in 2006/7 (ibid.). However, this improvement is inconsistent across all subjects, and the introduction of DEIS resources from 2008 to 2011 did not coincide with a significant increase in Junior Certificate mathematics performance (ibid.). A more recent study based on the same data looking at the 2002‑2016 period indicates a positive trend of progress for students in DEIS Post-primary schools, showcasing a closing of the achievement gap in overall performance scores, English and mathematics (Weir and Kavanagh, 2018[15]). Despite the progress, students in DEIS schools consistently achieve lower average mathematics and science results than their peers in non-DEIS schools (Gilleece et al., 2020[61]).
The analyses also highlight a substantial social context effect. This indicates that being a student in a school with a high concentration of socio‑economically disadvantaged students is significantly and negatively associated with achievement over and above the student’s own socio‑economic status (Weir and Kavanagh, 2018[15]). Furthermore, more recent research has shown variation in the association between student achievement and school socio-economic composition across the achievement distribution, with a stronger association at its lower end, particularly in reading (Flannery, Gilleece and Clavel, 2023[63]). As outlined in other chapters, this suggests the need for integrated policies in education, housing and labour markets.
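To make the notion of a social context effect concrete, the sketch below illustrates how such an effect is commonly tested in the school-effectiveness literature: a two-level (mixed) model regresses achievement on the student's own socio-economic background and on the school's average background, with a random intercept per school. This is an illustrative sketch only, not the specification used in the cited studies; the dataset, file and column names are hypothetical.

```python
# Illustrative two-level model of a "social context effect"
# (hypothetical data; not the specification of the cited studies).
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level dataset with one row per student.
students = pd.read_csv("students.csv")

# Fixed effects: the student's own background ("ses") and the school
# average ("school_mean_ses"); random intercept for each school.
# A significant negative coefficient on school_mean_ses, over and
# above ses, would indicate a contextual (composition) effect.
model = smf.mixedlm(
    "score ~ ses + school_mean_ses",
    data=students,
    groups=students["school_id"],
)
result = model.fit()
print(result.summary())
```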
Beyond academic performance, DEIS Post‑primary schools have demonstrated reductions in the total number of days lost through student absence and in the number of students absent for 20 days or more between 2015/16 and 2016/17 (Millar, 2017[64]). However, a more recent analysis shows that many principals in DEIS schools viewed unauthorised student absenteeism as a hindrance to learning (Nelis et al., 2021[22]). The COVID‑19 pandemic has also impacted school absenteeism (Tusla, 2023[58]). In 2019/20, the rate of 20-plus day absences in DEIS schools stood at 17.1%, compared to 7.7% for all post‑primary schools (a gap of 9.4 percentage points). This rate increased to 23.1% in 2020/21 in DEIS schools, compared to 9.7% for all schools (13.4‑point gap). For the 2021/22 school year, the rate in DEIS schools stood at 36.8% compared to the national rate of 24.5% (12.3‑point gap) (ibid.). However, further research and policy discussion are needed to identify ways of responding to and supporting schools with high levels of student absenteeism, as these data suffer from poor response rates from schools (ibid.). The understanding of the requirement to record absences during the COVID‑19 pandemic also varied between schools.
Evidence exists that the gap between DEIS and non-DEIS schools in retention rates has narrowed over time (Smyth, McCoy and Kingston, 2015[56]; Weir and Kavanagh, 2018[15]), and the DoE publishes regular updates on gaps in retention rates between DEIS and non‑DEIS schools, as well as by socio‑economic status (Department of Education, 2023[17]). The DEIS gap in retention rate until the Leaving Certificate has fallen from 15.6 percentage points for the 2003 entry cohort to 8.4 points for the 2016 cohort (Department of Education, 2023[65]) (see also Chapter 1). Finally, in terms of student well-being outcomes, broadly speaking, there were no significant differences between students in DEIS and non‑DEIS post-primary schools, whether looking at meaning in life, self-efficacy or bullying (Nelis et al., 2021[22]).
Evaluations of the DEIS programme were also qualitative. For instance, early evaluations of the DEIS programme included focus groups with school staff (Weir and Archer, 2011[66]). These revealed, for example, widespread approval for the role of school development planning. During visits, the Inspectorate also regularly interviews principals, teachers, students and parents/guardians. Together with the quantitative sources described above, these interviews have allowed evaluations of the DEIS programme to benefit from a combination of quantitative and qualitative data.
However, as elaborated in the section on Challenges, none of the studies included an identification strategy (e.g. with a control group) that would support causal inferences about the impact of the DEIS programme on student outcomes. The second policy recommendation outlines methodologies that could help estimate the causal effects of the programme, although the application of such methods is contingent on the availability of appropriate data.
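By way of illustration, one widely used quasi-experimental design is difference-in-differences, which compares the change in outcomes for schools entering a programme with the change for comparable schools that did not enter. The sketch below shows a minimal version under strong assumptions (notably parallel pre-programme trends); the dataset and variable names are hypothetical, and this is not presented as a method applied in the studies cited above.

```python
# Minimal difference-in-differences sketch (hypothetical panel data).
# "deis" = 1 for schools that entered the programme; "post" = 1 for
# school years after entry. The coefficient on the deis:post
# interaction is the difference-in-differences estimate.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("school_panel.csv")  # hypothetical school-year panel

did = smf.ols("score ~ deis * post", data=panel).fit(
    cov_type="cluster",                       # cluster-robust errors
    cov_kwds={"groups": panel["school_id"]},  # clustered by school
)
print(did.params["deis:post"])
print(did.summary())
```

The credibility of such an estimate rests on the comparison group: because DEIS schools were selected precisely on the basis of disadvantage, a naive comparison with all non-DEIS schools would likely violate the parallel-trends assumption, which is why an explicit identification strategy matters.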
The system emphasises the role of self-evaluation for school improvement
School self-evaluation (SSE) is a crucial aspect of educational practice among OECD countries, reflecting a commitment to continuous improvement. Internally driven by school community members, SSE systematically examines and reflects on current practices, steering towards future goals (Barry et al., 2022[67]). SSE places the entire learning organisation under scrutiny, emphasising improvement and reflection as primary objectives (Brady, 2019[68]; Skerritt and Salokangas, 2019[69]). School self-evaluation is a long-established process in OECD education systems. In some, the practice is required by law, while in other countries, it is recommended or required only indirectly (e.g. by developing school guidelines) (European Commission/EACEA/Eurydice, 2015[70]; OECD, 2015[71]). The prevalence of SSE is underscored by the fact that, in 2022, 19 out of 34 OECD education systems provided guidelines for assessing equity and inclusion within the SSE framework (OECD, 2023[3]).
The significance of SSE lies in its ability to empower schools to analyse their strengths and weaknesses (OECD, 2015[71]). Internal evaluation can, moreover, go beyond mere assessment, fostering ownership of change and sensitivity to areas needing improvement (Godfrey, 2020[72]). It serves as a valuable tool for identifying continuing professional learning needs for teachers and promoting on-going advancement in instructional practices. In the realm of equity and inclusion, SSE can become a catalyst for positive change. The process can lead to revisions in curriculum content or organisation, provision of targeted support for specific student groups, and the identification of barriers hindering inclusive education (OECD, 2023[3]). By analysing aspects such as school climate, relationships, learning support and barriers to continuing professional learning, SSE becomes a mechanism for schools to identify and address challenges in creating an inclusive environment. SSE can also foster increased reflection. Indeed, an initiative in 2022‑23 under the Irish Presidency of the European Schools involved Irish higher education institutions providing teachers in the European Schools network with reflective practice trigger papers, tools, collaborative techniques, approaches to assist professional engagement among teachers, and supports for collaborative school improvement (Department of Education, 2024[14]).
A well-structured framework for school self-evaluation exists in Ireland
Since its formal integration into the Irish school system in 2012, the SSE process has become a cornerstone in enhancing the quality of education for students in Ireland. Governed by Looking at Our School (LAOS) 2022 quality frameworks and informed by “School Self-Evaluation: Next Steps September 2022 – June 2026”, SSE serves as a collaborative, internal reflection, review and planning mechanism aimed at advancing various aspects of teaching, learning and well-being (Department of Education, 2022[31]; Department of Education, 2023[73]). The SSE process is grounded in a culture of critical reflection and inquiry. This culture is also nurtured through crucial elements such as teacher professionalism, sharing of classroom practices, authentic assessment, developmental classroom observation, and professional feedback and peer learning (Department of Education, 2022[31]). Furthermore, authentic engagement with students and parents is integral to building a supportive SSE culture.
The SSE process follows a structured six-step framework, allowing schools to adapt it to their specific context and focus areas (Department of Education, 2022[31]). The process begins with collaboratively identifying a focus for SSE, ensuring its scope significantly impacts students’ learning and well-being, teaching quality, school leadership, and provision for equity and inclusion (Figure 6.2). Subsequently, schools gather evidence through various qualitative and quantitative sources, ensuring it is manageable, valuable and focused. The analysis and judgment phase involves bringing together the collected evidence, identifying central themes and reflecting on findings in reference to the LAOS quality frameworks (Department of Education, 2022[31]; Department of Education, 2023[73]).
As part of the SSE process, schools must undertake a well-being review underpinned by the “Wellbeing Policy Statement and Framework for Practice 2018-2023” (Department of Education and Skills, 2019[74]). Indeed, in the quality framework underpinning SSE and the work of the Inspectorate, student well-being has been recognised “both as an outcome of learning and as an enabler of learning” (Department of Education, 2022, p. 6[27]). Especially pertinent after the COVID‑19 pandemic, these frameworks provide tools and resources for schools to explore ways of promoting student well‑being (Department of Education, 2022[27]; Department of Education, 2022[28]). These are complemented by materials provided by the National Educational Psychological Service, and curriculum materials developed by the NCCA to support well‑being in Early Years, Primary and Post-primary Curricula (Department of Education, 2023[75]; NCCA, 2024[76]).
Once the analysis is complete, schools write and share a report and improvement plan, documenting the main findings and agreed-upon improvement actions. The annual report and improvement plan are shared with board members and staff, with considerations for sharing critical points with parents and students. Notably, the SSE report and improvement plan are intended as internal tools for school development rather than broader public communication (Department of Education, 2022[31]).
The subsequent steps involve implementing the improvement plan and monitoring actions while evaluating their impact. Clarity on responsibility for implementation, timeframes and methods for monitoring impact is crucial during these phases (Department of Education, 2022[31]). These steps ensure that SSE outcomes have a tangible and positive effect on learning and teaching experiences, including students’ well-being.
Three levels of support are available to further aid schools in their SSE processes (Department of Education, 2022[31]). Level 1 offers regional SSE information sessions, support and advisory visits, webinars, presentations, newsletters and advisory engagements tailored to the schools’ context. Level 2 envisions a complementary relationship between internal and external evaluation, aiming for a two-way flow of information. Level 3 encourages collaboration within and among schools. For instance, the Shared Evaluation for Learning Project brings together the Inspectorate and school leadership in a small sample of schools to collaboratively evaluate the quality of an aspect of teaching and learning in the school (ibid.). As mentioned in the section on Contextual background, most students attended schools where principals reported undertaking self-evaluation in 2022. Earlier results also show that DEIS schools exhibited a slightly higher percentage (97%) of self‑evaluation than non‑DEIS schools (95%) (Shiel et al., 2022[77]).
School self-evaluation is viewed as a necessary and inherently positive process in the DEIS programme
The DEIS action planning process is pivotal in driving systematic improvements in schools receiving additional support and resources through the DEIS programme. Since its introduction in 2005, DEIS has mandated schools to engage in a comprehensive self-evaluation action planning process, focusing on specific improvement themes (Department of Education, 2022[78]). The DEIS action planning process involves developing a three-year improvement plan, addressing key themes such as attendance, retention, literacy, numeracy, supporting educational transitions, partnership with parents and others, examination attainment (post-primary schools only), leadership, well-being, and continuing professional learning. It focuses on how the school intends to ensure that its DEIS supports (see Chapter 1) are targeted at students most at risk of educational disadvantage. The emphasis on SMART targets (specific, measurable, achievable, relevant and time-bound) aims to ensure a clear, focused and strategic approach to improvement efforts (ibid.).
Integral to the DEIS action planning process is the involvement of students, parents, local communities and agencies operating at the local level (Department of Education, 2022[78]). This collaborative dimension is essential in outlining strategies and interventions to achieve SMART targets, ensuring that interventions are designed to meet the needs of the most-at-risk students. Furthermore, the process underscores the importance of targeting DEIS support, including using the DEIS grant, toward students most at risk of educational disadvantage (Department of Education, 2023[32]).
In alignment with the broader context of SSE, DEIS schools must engage in a six-step SSE process to devise their DEIS action plan (Department of Education, 2022[78]). This process involves gathering evidence, analysing data, setting priorities for development and improvement, writing and sharing the plan, implementing the plan, and evaluating its impact (ibid.). The annual review allows schools to examine progress, assess target achievement, and refine plans based on changing educational needs. DEIS schools are not obliged to operate a parallel planning process involving one set of plans for DEIS and another for SSE (Department of Education, 2018[79]).
The Inspectorate also evaluated DEIS action planning in 2017‑2020 (Department of Education, 2022[78]). The effectiveness of the process was underscored by a strong culture of planning for improvement observed in the inspected schools. Principals had established structures, such as appointing a DEIS coordinator and establishing DEIS teams, to promote planning for improvement as a shared responsibility among the leadership team and staff. In the most effective DEIS schools, the DEIS action plan became integral to the core work of the school, particularly in shaping teaching and learning. Teachers’ planning and subject plans often reflected the DEIS action plan, indicating a seamless integration of whole-school DEIS strategies, especially in primary schools. Other successful elements included setting high expectations for all students, using explicit teaching strategies, collaborative teaching practices, and evidence-informed interventions to bolster literacy, numeracy and well-being. Notably, DEIS action planning served as a mechanism for schools to manage change and develop their agenda for school improvement. It fostered ownership among teachers, promoting a shared commitment to the change agenda and overall school improvement (ibid.).
The Inspectorate serves a vital role in school evaluation
The Inspectorate in Ireland plays a crucial role in ensuring the standards and quality of education provision across various educational settings, including ECEC settings, primary, post-primary and special schools, and others (Department of Education, 2023[80]). The primary objective of the Inspectorate is to assure quality and public accountability within the education system. This is achieved through a multifaceted approach, including school inspections, focused or thematic evaluations and the publication of various reports (ibid.). Indeed, one of the critical functions of the Inspectorate is the publication of inspection reports on schools. In Ireland, the Inspectorate conducts various types of inspections across primary and post-primary schools (Department of Education, 2023[29]):
Whole school evaluations assess the overall quality of school management, leadership, teaching, learning and assessment, with variations in processes for primary and post-primary schools;
Curriculum evaluations in primary schools focus on specific subjects, evaluating the quality of students’ learning, how the school supports learning, and the school’s planning for the subject;
Subject inspections in post-primary schools assess individual subjects, evaluating teaching, learning and departmental planning;
Programme evaluations inspect specific programmes in post-primary schools, such as Transition Year and the Leaving Certificate Applied programme, focusing on planning and teaching quality;
Follow-through inspections gauge a school’s progress in implementing recommendations from previous inspections;
Specialised or thematic inspections, with a research focus, are employed to examine specific subjects or issues, providing oral feedback and a written report to the school, and often contributing to national reports summarising identified trends;
Evaluation of inclusive practices and provision for students with additional and special educational needs in primary and post-primary schools evaluates the quality of inclusive practices in a school and the provision for students in receipt of additional support from the school;
Child protection and safeguarding inspections monitor the implementation of the Child Protection Procedures for primary and post-primary schools in a sample of schools annually;
Incidental inspections are unannounced inspections that evaluate aspects of the work of a school under the normal conditions of a regular school day; and
Evaluation of action planning for improvement in DEIS schools focuses on how schools devise, implement and monitor Action Plans for Improvement for the DEIS themes. It also enables inspectors to evaluate the effectiveness of schools implementing specific interventions and initiatives.
Reports are published on all inspections except incidental inspections. The Inspectorate reports provide a comprehensive overview of the quality of learning and teaching, offering findings, recommendations and examples of best practices. The Inspectorate's commitment to transparency is evident in its provision of oral feedback to the school community after inspections, coupled with the publication of detailed written reports on its website (Department of Education, 2023[29]). More information about how these publications feed into policy making is provided in Chapter 2.
The Inspectorate places a strong focus on the evaluation of DEIS schools individually and at the system level
As mentioned before, the Inspectorate carried out 1 820 inspection and advisory activities in 3 095 primary schools and 851 inspection and advisory activities in 728 post‑primary schools in 2022; 608 of the 1 820 inspection activities in primary schools took place in schools in the DEIS programme and 277 of the 851 inspection activities took place in DEIS Post-primary schools. As part of the inspection programme in DEIS schools, the Inspectorate also carries out evaluations of the quality of action planning for improvement in a sample of primary and post-primary schools (Department of Education, 2023[29]).
The Inspectorate’s goal is to provide DEIS Post‑primary schools with some form of inspection every two years, ranging from short, one-day unannounced incidental inspections, to more intensive whole‑school evaluations and inspections (see above for more details). The planning process for inspections at the primary level is based on a range of criteria, including length of time since previous inspection and recommendations from earlier reports. The inspection programme at primary and post-primary levels also includes follow-through inspections, which evaluate the progress that school leadership, in collaboration with the school community, has made in implementing some or all of the main recommendations made in an earlier inspection. Follow-through inspections typically happen within two years of the original inspection. However, for schools where significant challenges are identified, the follow‑up visit takes place sooner and may involve other inspection models (including, for example, subject and programme inspections, improvement monitoring or management evaluations).
The Inspectorate’s evaluation process also involves a comprehensive examination of DEIS schools, encompassing leadership, teaching quality and overall school improvement. It has developed a dedicated model, the Evaluation of Action Planning for Improvement in DEIS Schools, focusing on the effectiveness of school-based action planning processes in DEIS Urban Band 1 primary and DEIS Post-primary schools (Department of Education, 2022[78]). This model, which has been in use since 2010, involves annual evaluations, the findings of which are published in composite reports.
The “Looking at DEIS Action Planning for Improvement in Primary and Post-Primary Schools” publication is the first of three reports intended to review and evaluate the implementation of the DEIS Plan 2017 (Department of Education, 2022[78]). This report provides insights into various aspects, including school life, leadership of DEIS action planning, and the quality of teaching, learning and professional development. The subsequent reports will delve into DEIS action planning for literacy, numeracy and examination attainment, and themes like attendance, retention, transitions, and partnership with parents and the school community. The report emphasises the importance of shared responsibility and ownership of the DEIS action plan within school leadership teams and staff. The distributed leadership responsibilities contribute to establishing structures that promote planning for improvement (ibid.).
The report’s findings highlight positive efforts in DEIS schools, with many interventions implemented to enhance literacy, numeracy and student well-being (Department of Education, 2022[78]). The commitment to creating inclusive classrooms is acknowledged, with differentiated supports provided within mainstream settings. Collaboration between teachers and special education teachers to meet students’ diverse needs was evident in primary and some post-primary schools (ibid.).
However, the report identifies areas for improvement, particularly in post-primary schools (Department of Education, 2022[78]). It recommends additional support for building inclusive school and classroom environments. Specifically, the National Council for Special Education was advised to provide assistance in implementing team teaching within mainstream classrooms and to offer guidance on the best methods for supporting differentiation in various subject areas (ibid.).
Challenges
Limited use of granular and combined administrative data
The section on Contextual background described the wealth of information available to the DoE and the Irish system’s strong expertise in monitoring and evaluation. Nevertheless, certain gaps persist. The most prominent challenge is that the DoE has not yet fully exploited the potential of the data estimating the socio‑economic background of the areas where students reside based on HP Index scores (more on the HP Index in Chapter 1). HP Index scores are supplied to the DoE at the student and school levels. Yet, the DoE and the research community have so far made only limited use of these data, although some progress has been made in recent years (e.g. the DoE is now using HP Index scores to assess the impact of disadvantage on retention outcomes (Department of Education, 2023[65])).
It is recognised that not all students at risk of educational disadvantage are enrolled in schools in the DEIS programme, yet the number and proportion of such students outside DEIS schools are currently unpublished at the population level. It is likewise not publicly known what proportion of the most disadvantaged students are enrolled in DEIS schools. Conversely, some students in schools in the DEIS programme are not at risk of educational disadvantage. For instance, findings from PIRLS 2021 show that about 60% of participating students in DEIS Urban Band 1 primary schools were in the lowest socio‑economic quartile, while 5% were in the highest socio-economic quartile (Delaney et al., 2023[50]). In contrast, in non-DEIS schools, 19% of PIRLS students were in the lowest socio-economic quartile and 29% were in the highest (ibid.). All of these questions would benefit from a deeper analysis of the HP Index data combined with DEIS school status, as sketched below.
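As a stylised illustration of the kind of analysis this would enable, the sketch below links hypothetical pupil-level records to student-level HP Index scores and cross-tabulates the most disadvantaged decile against the DEIS status of the school attended. All file and column names are illustrative, and actual linkage would depend on data-governance arrangements.

```python
# Illustrative sketch: what share of the most disadvantaged students
# (by HP Index) attend DEIS schools? All names are hypothetical.
import pandas as pd

pupils = pd.read_csv("pod_ppod_extract.csv")  # hypothetical pupil records
hp = pd.read_csv("hp_index_scores.csv")       # hypothetical student-level HP scores

linked = pupils.merge(hp, on="student_id", how="left")

# Flag the most disadvantaged decile, assuming lower HP Index scores
# indicate greater disadvantage.
cutoff = linked["hp_score"].quantile(0.10)
linked["most_disadvantaged"] = linked["hp_score"] <= cutoff

# Share of the most disadvantaged decile enrolled in DEIS vs non-DEIS
# schools ("deis_school" is a hypothetical boolean column).
share_in_deis = (
    linked.loc[linked["most_disadvantaged"], "deis_school"]
    .value_counts(normalize=True)
)
print(share_in_deis)
```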
Non‑administrative data sources have been widely used to partially fill this gap. For instance, the Growing up in Ireland and NAMER surveys offer a wealth of background characteristics, including social class, parental education, household income and family structure, and an indicator of whether the student attends a DEIS school (Smyth, McCoy and Kingston, 2015[56]). However, surveys based on self-reported information suffer from missing data. In NAMER 2009, almost a fifth of students did not report their socio‑economic background (Eivers et al., 2010[81]). In NAMER 2021, socio-economic background information was unavailable for all students as the parent/guardian questionnaire was not administered due to changes in study procedures associated with the COVID-19 pandemic (Nelis and Gilleece, 2023[8]). In PIRLS 2021, 7% of parents did not complete the questionnaire that included socio‑economic information (Delaney et al., 2023[50]). In both NAMER 2009 and PIRLS 2021, the pattern of missingness was not completely random: students who did not report their socio‑economic background tended to score lower in reading and mathematics (Delaney et al., 2023[50]; Eivers et al., 2010[81]).
Indeed, without access to population-wide student-level data on socio-economic background, research is often hindered by small sample sizes and non-response rates. For instance, analyses of available survey data suggest that gaps between DEIS and non‑DEIS schools can persist even after considering students’ socio-economic background (see section Studies and evaluations of the DEIS programme use a wide range of quantitative and qualitative sources). These contextual effects are often most pronounced in the most disadvantaged DEIS Urban Band 1 schools. In contrast, the findings suggest the absence of contextual effects in DEIS Rural schools, i.e. once the socio‑economic background of students is taken into account, there are no significant differences in reading and mathematics scores compared to non‑DEIS schools. This result holds even after accounting for other factors, such as school resources, teacher factors, school climate and student engagement. Some authors suggest that there is a “threshold effect” whereby concentrations of disadvantage beyond a particular point result in lower levels of achievement (McCoy, Quail and Smyth, 2014[52]). Nonetheless, the limitations imposed by small sample sizes and non-response rates underscore the need for comprehensive student-level data to deepen the analysis of socio-economic factors in education research.
Furthermore, many of these analyses can only assess average outcomes, limiting the ability to discern variations in performance between DEIS and non-DEIS schools for specific student groups, particularly those from highly disadvantaged backgrounds (Smyth, McCoy and Kingston, 2015[56]). The achievement gap between disadvantaged and non-disadvantaged schools encompasses two components: (a) the disparity in achievement among individual students from diverse social backgrounds, and (b) the “multiplier effect”, i.e. the additional impact of the concentration of disadvantage within a school. Currently, the absence of individual student background data impedes the capacity of researchers to differentiate between these two components, precluding a comparison between students from disadvantaged backgrounds attending DEIS schools and those attending more socially mixed schools (ibid.).
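Were population-wide student-level background data available, these two components could in principle be separated with a multilevel model that includes both a student’s own socio-economic background and the school-level concentration of disadvantage. The sketch below illustrates one possible specification; it is a minimal illustration only, and the dataset and column names (score, ses, school_id) are hypothetical placeholders rather than an existing resource.

```python
# Minimal sketch: separating the student-level socio-economic gradient from
# the school-level "multiplier" effect with a mixed-effects model.
# All column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical student-level dataset

# School-mean SES proxies the concentration of (dis)advantage in a school.
df["school_mean_ses"] = df.groupby("school_id")["ses"].transform("mean")

# Random intercept per school; the coefficient on `ses` captures the
# individual-level gradient (component a), while `school_mean_ses` picks up
# the additional contextual "multiplier" effect (component b).
model = smf.mixedlm("score ~ ses + school_mean_ses",
                    data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())
```

In such a specification, a significant coefficient on the school-mean variable, over and above the individual-level gradient, would be consistent with the contextual and threshold effects discussed above.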
The OECD review team learned that there are capacity, technical and legislative barriers to sharing, using and disaggregating administrative data. Capacity issues between public institutions in sharing data can arise from various factors. One challenge is the lack of standardised data formats and interoperability standards across different institutions. Public agencies often employ disparate systems and databases that may not seamlessly communicate with one another, leading to difficulties in exchanging information efficiently. Additionally, varying levels of technological infrastructure and resources among institutions can hinder their ability to implement and maintain robust data-sharing mechanisms. Furthermore, limited funding and budget constraints may impede the development of comprehensive data-sharing infrastructures and staff training to handle such initiatives. Indeed, the “Review of DEIS” report by the then Department of Education and Skills identified that “a specific data analytics function” is required in the DoE, which is “properly resourced with appropriately qualified staff to manage and interrogate the data as required for on‑going [DoE] business needs” (Department of Education and Skills, n.d., p. 36[82]).
Data security and privacy concerns further complicate the sharing process, as institutions must navigate complex legal and ethical frameworks to ensure compliance. In Ireland, as in many other countries, it is feared that data may be misused, for example, to maintain or deepen power relationships between majority and minority population groups (Balestra and Fleischer, 2018[83]; Durante, Volpato and Fiske, 2009[84]; Simon and Piché, 2012[85]). This is of particular concern in countries where, for instance, ethnicity-based data were used in the past to provide the basis for discriminatory practices, and for groups that have previously experienced ethnic profiling, segregation, genocide and violence (Balestra and Fleischer, 2018[83]).
Despite these challenges, the DoE is engaging in analysis of administrative data including, to some extent, the HP Index. For instance, the DoE tracks the 2016 cohort in terms of retention rates, where students are disaggregated by the level of affluence based on the HP Index (Department of Education, 2023[65]). In regard to other administrative data sources, the ERC has used the possession of a medical card as a proxy for student‑level socio‑economic background. Medical card holders can access certain health services free of charge and, additionally, received a waiver of state examination fees (Weir and Kavanagh, 2018[15]). To qualify for a medical card, household income must fall below a specific threshold that depends on family size (Citizens Information, 2023[86]). However, since 2020 (when examination fees were waived in response to changes with state examinations due to the COVID-19 pandemic), information on student possession of a medical card has not been available in state examination databases (Department of Education, 2024[14]).
Absence of a control group and causal implications in DEIS evaluations
A related challenge to the lack of granular and combined administrative data is the absence of a control group in evaluations of the DEIS programme. This drawback has been repeatedly identified in almost all studies that aimed to gauge the programme’s impacts (Smyth, McCoy and Kingston, 2015[56]; Weir and Denner, 2013[87]; Weir et al., 2014[62]; Weir et al., 2018[35]). While the use of a control group may also have ethical implications (Golden, 2020[88]), its absence means that it is impossible to analyse truly comparable groups when looking at student outcomes and, as such, to credibly estimate the DEIS programme’s causal effects (Gilleece and Clerkin, 2024[89]; Kavanagh, Weir and Moran, 2017[48]). The challenge of establishing causality is not exclusive to the assessment of DEIS or educational evaluation in Ireland. It is acknowledged as a significant hurdle in evaluating educational policy initiatives internationally (European Commission, Directorate-General for Education, Youth, Sport and Culture, 2022[90]; Golden, 2020[88]).
Estimating the causal impacts of the DEIS programme is, of course, inherently challenging due to the presence of numerous pre-existing educational initiatives (Gilleece and Clerkin, 2024[89]). Given the extensive history of prior programmes, the difficulty lies in developing a strategy for identifying the causal mechanism, i.e. precisely identifying the commencement of DEIS and the “treatment group” (ibid.). Moreover, the successes attributed to DEIS are intertwined with the cumulative impacts of earlier initiatives, such as Breaking the Cycle, making it difficult to isolate and attribute specific causal effects to the DEIS programme (INTO and Educational Disadvantage Centre, 2015[91]). Indeed, research suggests that there was a considerable overlap of schools having access to various programmes later integrated under the DEIS umbrella (Weir and Archer, 2005[92]). Furthermore, the success of the implementation of different programmes, before and after the introduction of the DEIS programme, might vary among schools (Gilleece and Clerkin, 2024[89]).
Other challenges relate to the potential indirect effects of the DEIS programme due to staff (and students) moving between DEIS and non‑DEIS schools with varying levels of expertise, continuing professional learning and experience (Gilleece and Clerkin, 2024[89]). Further indirect effects might relate to providing extra‑curricular activities (that might improve students’ academic and non‑academic outcomes) based on their enrolment in a DEIS school (ibid.). Furthermore, DEIS is not the only initiative implemented in the education system, and it might be challenging to disentangle the effect of DEIS from other policies, strategies and supports (ibid.). If unaccounted for, all these effects can bias even those methodologies that aim to provide causal estimates, including those involving control groups. More broadly, rigorous policy evaluation also assumes a conducive socio-political environment, as barriers to effective evaluation can arise from (political) conflicts, timing issues and other factors, which may hinder planning and the institutionalisation of evaluation practices (Golden, 2020[88]).
The absence of a control group and of causal estimates has, however, important policy implications in terms of resourcing. Difficulties in establishing causal effects impede the accurate measurement of the DEIS programme’s impact on student outcomes and raise critical questions about the allocation of resources. Limited attention has been devoted to assessing whether the additional funding allocated to DEIS schools can narrow the resource gap between disadvantaged and non-disadvantaged schools (Smyth, McCoy and Kingston, 2015[56]). Without causal estimates, policy makers face challenges in determining whether observed changes in student achievement can be attributed to the DEIS programme or to other factors and, as such, in determining the value for money of the public investment. Consequently, the absence of causal estimates compromises the ability to make informed decisions in regard to optimising and allocating resources for educational interventions. This underscores the need for improved methodologies and data collection strategies to address these limitations and ensure a more rigorous evaluation of the DEIS programme’s effectiveness, ultimately guiding more effective resource allocation and shaping evidence-based educational policies.
There is little capacity for data-informed improvement planning in DEIS schools
DEIS schools are expected to gather evidence and analyse data as part of the six-step self-evaluation that underpins the DEIS action planning process. However, some post-primary schools’ capacity to collect, interpret and use data to develop evidence-informed improvement strategies remains limited. Inspection reports have highlighted schools’ ability to use “assessment data to inform teaching and learning, to monitor how effective different teaching strategies are and to highlight areas for professional learning for staff members” as a key element of success for DEIS schools (Department of Education, 2022, p. 42[78]).
At the school level, there is a wealth of data generated through assessments and the action planning process that can – if used effectively – help schools provide their students with the right supports at the right time (e.g. data on attendance, parental involvement, performance, etc.). This includes analysing data from formative and summative assessments against measurable targets, and other information, such as the views and perspectives of teachers, parents/guardians and students. However, the OECD review team’s impressions were aligned with the Inspectorate’s (Department of Education, 2022[78]) assessment that not all schools are confident in setting SMART (specific, measurable, attainable, realistic and time-bound) targets and in collecting and analysing data to evaluate their performance against them.
Overall, post-primary teachers in Ireland appear to have rapidly improved their digital skills throughout the COVID-19 pandemic, showing one of the most remarkable improvements among OECD countries. Between 2018 and 2022, the proportion of principals who agreed or strongly agreed that their teachers had the necessary technical and pedagogical skills to integrate digital devices into their instruction rose by more than 40 percentage points, from 49.3% to 95.3% (OECD, 2023[25]). In addition, 82.0% of principals were confident that teachers had effective professional resources to learn how to use digital devices (ibid.).
This improvement in digital literacy provides a sound basis for strengthening teachers’ and principals’ use of data for decision making. Stakeholders interviewed by the OECD review team concurred that the external support provided to schools had improved since the self-evaluation process was introduced in 2012/13. Professional Development Support for Teachers (PDST) (now Oide) offers post‑primary school leadership teams an opportunity to engage in a one-year programme on data and research-informed school planning (PDST, n.d.[93]), and the Inspectorate offers advisory visits (upon school request) to support the robust use of school-level data for the action planning process. Nevertheless, the OECD review team formed the impression that most principals had not yet received sufficient training or guidance and were not aware of, or availing themselves of, the support on offer.
A frequently reported impediment to the digital transformation of schools is a lack of technical support. In 2022, 63.2% of 15-year-old students were in schools where principals reported a lack of qualified technical assistant staff (Figure 6.3). This was one of the highest proportions among OECD countries and significantly above the OECD average of 41.2%. The shortage of technical assistant staff was particularly felt in rural schools, where 76.9% of students were in schools whose principals reported such a lack. Although socio-economically disadvantaged schools were more likely to report lacking technical assistance than advantaged schools (66.0% vs. 58.9%), this difference was not statistically significant (OECD, 2023[25]).
This perceived lack of technical support has also been identified as a key impediment to embedding digital technologies in teaching, learning and assessment in the “Digital Strategy for Schools to 2027” (Department of Education, 2022[10]). Yet, the need to build digital capacity extends beyond the classroom and teachers’ use of digital education technologies. To ensure the effective use of data for school improvement more generally, capacity building will need to extend to the school leadership and beyond.
Policy recommendations
Implement more comprehensive data integration and analysis in education policy making
The effective use of data is pivotal for crafting policies that cater to the diverse needs of students. Recognising this, there is a growing emphasis on the need to harness the full potential of administrative data to better understand and address the challenges faced by students. To this end, enhancing the educational system’s efforts and capability to utilise detailed data more effectively is important. This comprises two fundamental strategies: strengthening the analysis of student-level HP Index information in the short term, and fostering stronger inter-departmental collaborations to expand the range of student background characteristics in the long term. In addition, it is recommended to improve monitoring by utilising standardised assessments. Together, these initiatives have the potential to transform the landscape of data utilisation, paving the way for more informed and effective decision making.
Such data integration could benefit the DoE and also a broader spectrum of stakeholders. For instance, enhancing data quality and integrating data from different datasets was highlighted among the recent key OECD recommendations for tackling child poverty and improving outcomes for children and young people, including implementing Young Ireland, the National Policy Framework for Children and Young People (0‑24) 2023-28 (OECD, 2024[94]).
Enhance the analysis of student-level HP Index data
While the primary goal of the DEIS programme is to address concentrations of disadvantage in schools, the OECD review team believes that understanding what proportions of disadvantaged students are targeted by the DEIS programme, accounting for demographic and economic changes, is crucial for informed decision making, especially in regard to the inclusion of more schools in the DEIS programme. Indeed, one of the recommendations in Chapter 3 is to examine scenarios to attenuate the adverse effects of key thresholds in the DEIS classification algorithm. Before rolling out such a system, accurate data on the level of disadvantage of students who are (and are not) likely to be targeted should be analysed.
Moreover, it is important to know the level of socio‑economic disadvantage among those who are and are not reached by the DEIS programme. For instance, PIRLS 2021 data indicate that approximately one fifth (19%) of students in non-DEIS schools were in the lowest socio-economic quartile (Delaney et al., 2023[50]). Therefore, it would be important to use a proxy for socio‑economic background that distinguishes levels of disadvantage on a non‑binary basis. The HP Index (Chapter 1) data, already available to the DoE, could provide a practical solution to this challenge, although, as mentioned below, other options should also be explored. Addressing these challenges could yield several benefits. While considerable resources have already been invested in analysing the differences between rural and urban settings (Weir and McAvinue, 2013[95]; Weir, Errity and McAvinue, 2015[96]), a student-level proxy could lead to a more nuanced understanding of these differences. Research suggests that urban and rural disadvantages differ both quantitatively and qualitatively (Smyth, McCoy and Kingston, 2015[56]). While urban schools may face academic challenges, rural schools and communities often contend with socio-economic and cultural exclusion impacting students’ holistic development (ibid.).
It could also provide more information on how students with a similar level of socio‑economic disadvantage fare within DEIS and non‑DEIS schools. For example, emerging research suggests that while overall performance gaps have decreased over time, this does not necessarily reflect improved equality. Indeed, only after interacting the DEIS status variable with a proxy for student‑level socio‑economic background did researchers find that the relationship between home resources for learning (a proxy for socio‑economic background) and achievement has strengthened (Duggan et al., 2023[55]). This can suggest that the inequality of opportunity linked to student-level socio‑economic factors has, in fact, increased over time (ibid.).
Furthermore, it could enhance analyses by focusing on various disadvantaged groups and interacting background characteristics with socio‑economic status at the student level. For instance, while the DoE and other departments place great emphasis on Traveller and Roma students through specialised strategies (Department of Children, Equality, Disability, Integration and Youth, 2017[97]) and other publications (see, e.g. Department of Education (2023[98])), the OECD review team heard that there is a significant gap in understanding the experiences and challenges of Traveller and Roma students. For example, despite 71.9% of Traveller and Roma students at primary level and 55.0% at post‑primary level being enrolled in DEIS schools (Department of Education, 2023[99]), there has been no systematic assessment of their educational outcomes compared to other ethnic groups when considering socio‑economic background. This can hinder the ability to formulate targeted policies and interventions to address their needs.
There are objective limitations in working with individual data on ethnic and cultural background: in 2016/17, for instance, almost a third of students in primary schools did not provide their ethnic and cultural background information (Tickner, 2017[100]). Such a high non‑disclosure rate can significantly impact the quality of the analyses and conclusions based on the data. Nevertheless, working with a broader dataset focusing on the socio-economic background of Traveller and Roma students in DEIS and non‑DEIS schools could provide critical insights into the factors influencing their educational trajectories. This could contribute to a more comprehensive understanding of the educational landscape for these student populations, facilitating the development of more effective and equitable education policies (OECD, 2023[3]).
Another advantage of accessing student-level data on socio-economic background is an improved sampling procedure for DEIS schools in PISA. Differences between the identification system used for DEIS since 2017 (the HP Index) and the socio‑economic indicator used for sampling in PISA (medical card status) may lead to some potential difficulties in using PISA data for monitoring outcomes in DEIS schools over time (Gilleece et al., 2020[61]). Some developments are already planned. For instance, the OECD review team learned that the ERC intends to examine the possibility of using HP Index data at the student level for analysis in PIRLS 2026, and the HP Index might also be used for sampling of PISA 2025 schools.
Strengthen inter-departmental discussions to broaden the pool of student background characteristics
While the HP Index could provide a practical short-term solution for obtaining a proxy for socio-economic background, it has several disadvantages. It is tied to Small Areas (see Chapter 1) rather than individual students/households. The HP Index aims to identify geographic areas of disadvantage and affluence, not individual-level disadvantage. Moreover, the HP Index is not a measure of poverty, although poverty and deprivation are closely correlated. As such, it does not distinguish between current poverty and the cumulative effects of persistent poverty over a child’s life, even though the cumulative effects of poverty are associated with more detrimental educational outcomes (Chaudry and Wimer, 2016[101]). Furthermore, an analysis of the HP Index’s effectiveness using PISA 2018 and administrative data revealed that the HP Index “represents a reasonable option for use in the DEIS identification process” and provides “a reasonable approximation of the school socio-economic context” (Gilleece and McHugh, 2022, pp. 19-20[102]). However, in a limited number of cases, it can also lead to the misclassification of schools based on their socio‑economic status, and it has lower predictive power in regard to reading achievement compared to some selected alternative measures (Gilleece and McHugh, 2022[102]).2
Merging datasets could thus broaden the understanding of currently unobserved aspects of socio‑economic disadvantage. For instance, combining and analysing other administrative sources with currently used databases (e.g. the HP Index), such as income data and social protection data, could provide a richer understanding of socio-economic contexts, and the complexity of the multifaceted challenges associated with educational disadvantage. It could improve the understanding of other dimensions of poverty and social inclusion, such as mental health needs and the role of grandparents in childcare (Downes, Pike and Murphy, 2020[103]). Chapter 3 also elaborates that the targeting effectiveness and efficiency of the DEIS programme could be improved by including additional dimensions of social disadvantage (e.g. psychological and socio‑emotional well‑being, cultural barriers and immigrant background).
Furthermore, access to other databases could lead to a quicker understanding of changing social and economic situations in particular areas. The HP Index’s reliance on census data has been welcomed as a move towards more objective and transparent criteria for determining which schools should be part of the DEIS programme. However, reliance on the census creates a five-year update cycle, and data from the census become available only several months after they are collected. Thus, the information may not capture rapid changes in certain areas, posing a risk of overlooking emerging challenges. Having access to proxies of socio-economic background that are more responsive to societal and labour market changes, such as unemployment rates and income/poverty rates, and that do not depend on the census, could provide a way to respond to challenges faced by schools inside or outside of the DEIS programme in a timelier manner.
In the longer term, it is, therefore, important to engage in inter‑departmental discussions to access a wider pool of variables that proxy for socio‑economic status beyond those already established and used at the DoE. To this end, utilising other administrative databases could enhance the comprehensiveness of the socio‑economic proxy (Box 6.1). Such combined datasets do not necessarily need to be used to refine the DEIS identification model; they could be used for more general monitoring purposes of the DoE (and other departments). Indeed, changing the DEIS identification approach to react to every demographic change (including potentially temporary changes) might risk diverting the programme from its focus on concentrated educational disadvantage, which typically requires sustained intervention over an extended period.
Box 6.1. Combined administrative datasets
The Netherlands Cohort Study on Education
The Nationaal Cohortonderzoek Onderwijs (Netherlands Cohort Study on Education) is a longitudinal research initiative utilising register data on student track placement in primary and secondary education. The dataset is held at Statistics Netherlands and combined with variables from other administrative registers. The first pillar explores students’ educational pathways, incorporating rich background information including age, gender, country of origin, parents’ marital status, household information, socio‑economic background of students and their parents, and other regional variables. The second pillar provides school-level data from the Dutch Ministry of Education and the Dutch Inspectorate of Education, encompassing school size, urbanisation level and school denomination. The third pillar involves microdata on student performance obtained from standardised assessments, offering insights into students’ progress in reading, spelling and mathematics between the ages of 8 and 12. Apart from research purposes, the database is also used to inform schools and school boards, among others, about students’ socio‑economic situation, performance and outcomes after they leave school.
Microdata from individual registers in Sweden
Statistics Sweden maintains the country’s administrative registers. Each Swedish resident is assigned a unique and permanent identification number at birth or at the point of immigration, which is recorded in each administrative database. Currently, individual registers in Sweden cover the labour market, population statistics (e.g. biological and adoptive links between persons), household finances and expenditures, income and taxation data, living conditions, electoral participation, and education and training data. Subject to approval and costs, many of these registers can be combined using the unique identification number. In education, one can connect, for instance, education results and outcomes with socio‑economic background (e.g. based on tax and income data or the unemployment status of parents/households) and study pathways.
Source: Research Centre for Education and the Labour Market (n.d.[104]), The Netherlands Cohort Study on Education, https://www.roa.nl/research/research-projects/netherlands-cohort-study-education (accessed on 22 November 2023) and Statistics Sweden (n.d.[105]), Mikrodata från individregister [Microdata from individual registers], https://www.scb.se/vara-tjanster/bestall-data-och-statistik/bestalla-mikrodata/vilka-mikrodata-finns/individregister/ (accessed on 22 November 2023).
That said, the DoE has already examined various data sources and methodologies to capture socio‑economic disadvantage better (Department of Education, 2022[106]). Thanks to these considerations, the DEIS identification model was refined to consider Traveller and Roma students, students residing in International Protection Accommodation Services centres, Emergency Orientation and Reception Centres, and those experiencing homelessness (Chapter 1). The DoE also examined the impact of crime and the needs of students for whom English is an additional language as additional data for inclusion in the DEIS identification model (ibid.).
Improve monitoring by utilising standardised assessments
Improving monitoring by utilising standardised assessments can yield several benefits. First, unlike international assessments operating on fixed timelines, national tests can be administered on more flexible schedules (Gilleece et al., 2020[61]). Second, national assessments enable benchmarking against national standards. This approach provides a valuable complement to international large-scale assessments, allowing for a more nuanced understanding of, for example, achievement in DEIS schools relative to national benchmarks. Standardised assessments are suitable for benchmarking the achievements of DEIS schools against national norms, given that national standardised assessments are normed to the Irish population (ibid.).
Indeed, the policy focus, as exemplified by the DEIS targets in the “National Strategy: Literacy and Numeracy for Learning and Life 2011-2020” (Department of Education and Skills, 2017[6]), has been on narrowing the achievement gap between DEIS and non-DEIS schools. However, given the broader societal and socio‑economic context, closing the overall DEIS achievement gap would be an “extremely ambitious agenda as it would mean reducing overall differences in educational outcomes between social class groups within and between schools” (Smyth, McCoy and Kingston, 2015, p. 76[56]). Aiming to reduce the adverse effects of the concentration of disadvantage in schools might be a more reasonable goal (Smyth, McCoy and Kingston, 2015[56]). That could require, for instance, changing the targets for specific socio‑economic groups within schools and, as such, large‑scale data that would allow for robust monitoring over time. Currently, this cannot be facilitated with international large-scale assessments (e.g. due to fluctuations in the number of DEIS schools across cycles) and can only partially be addressed with national sample‑based large‑scale assessments (e.g. DEIS Rural schools are often not part of the samples).
Standardised assessments can also help to measure effectiveness. There are many ways to measure school and system effectiveness. One quantitative approach is value-added modelling, a statistical technique used to assess the impact of a school (or teacher) on students’ academic progress over time. It aims to isolate the contribution of the educational environment by analysing changes in students’ achievement scores over time while accounting for factors such as prior performance, demographics and other contextual influences. Thus, value-added approaches try to isolate the school’s contribution to student learning from other factors associated with student learning, such as students’ socio-economic background (OECD, 2008[107]). Education systems often adopt value‑added models to devise a more realistic measure of a school’s performance, one that sets aside factors largely beyond the school’s control, such as differences in student composition or “random noise” (ibid.). Value-added models can also be used to focus attention on particular groups of students that are found to be low- or high‑performing. Value-added scores do not necessarily need to be featured in league tables (after all, Ireland does not publish results from standardised assessments either), and they can be helpful for the internal and external evaluation of schools (Box 6.2).
Box 6.2. The use of value-added measures in England (United Kingdom)
In England (United Kingdom), value-added scores (Progress 8) and other performance data and inspection reports are summarised for schools, local authorities, inspectors, dioceses, academy trusts and governors in the Analyse School Performance portal. It enables schools to analyse performance data in greater depth as part of the self-evaluation process, provides support with teaching and learning, and is the portal for inspectors to learn about school performance before an inspection visit. This interactive software enables schools and school inspectors to analyse the value‑added information to, for example, identify the value-added scores of students in particular subjects, at specific year levels and of specific student groups (e.g. socio-economically disadvantaged). Moreover, it provides users with the distribution of student value-added scores as well as other features. It can help users better understand where the school is successful and where improvement is needed.
Source: Ofsted (2023[108]), School inspection data summary report (IDSR) guide, https://www.gov.uk/guidance/school-inspection-data-summary-report-idsr-guide (accessed on 27 November 2023).
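To make the mechanics of value-added modelling concrete, the sketch below shows a simple covariate-adjusted calculation in which a school’s value-added is the average residual of its students after conditioning on prior attainment and socio-economic background. This is a minimal illustration under hypothetical data and column names; operational systems such as Progress 8 rely on considerably more elaborate specifications and uncertainty measures.

```python
# Minimal sketch of a covariate-adjusted value-added calculation.
# Dataset and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical longitudinal dataset

# Condition on prior attainment and socio-economic background so that the
# residual reflects progress beyond what these factors would predict.
ols = smf.ols("score_t ~ score_t_minus_1 + ses", data=df).fit()
df["residual"] = ols.resid

# A school's value-added is the mean residual of its students; values above
# zero suggest progress above statistical expectation.
value_added = (df.groupby("school_id")["residual"]
                 .agg(["mean", "size"])
                 .rename(columns={"mean": "value_added", "size": "n_students"})
                 .sort_values("value_added", ascending=False))
print(value_added.head())
```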
Value-added models are not without challenges. Critics argue that the overreliance on standardised testing narrows the educational focus, neglecting critical aspects of student development and creating incentives for “teaching to the test” (Rubin, Stuart and Zanutto, 2004[109]). They can also be sensitive to variations in test difficulty, small sample sizes and year-to-year variability, leading to unpredictable changes in evaluations (Everson, 2016[110]). In Ireland, some consideration has been given to the use of value-added models, with limitations noted in regard to the availability of data and other issues (Gilleece, 2014[111]; Sloane, Oloff-Lewis and Kim, 2013[112]). Nevertheless, value-added modelling was implemented on a sample of post‑primary schools (Doris, O’Neill and Sweetman, 2022[113]). It was found that while there was a considerable overlap in the ranking of schools based on raw performance scores and their value-added, some schools would move significantly in the overall ranking (ibid.). Moreover, while overrepresented among the highest-performing schools, fee-paying schools did not perform equally well when considering their value-added (ibid.).
Using assessment data could also shed more light on students’ experiences progressing from DEIS primary to non‑DEIS post‑primary schools. The OECD review team learned that some transitions between DEIS primary and non‑DEIS post-primary schools can present complex challenges for students and their families (particularly as they often lose access to an HSCL Coordinator). The discontinuity in support, such as removing HSCL Coordinators during progression, can have significant implications for families, leaving them without a vital source of assistance. Large‑scale longitudinal data could focus on these critical transitions, provide valuable insights into challenges faced by students in different school contexts, and inform the development of targeted policies to support smoother transitions and mitigate potential disruptions.
Finally, access to longitudinal datasets with assessment data could stimulate research into the effects of the DEIS label on students and teachers. Two potential forces can influence parental enrolment decisions in DEIS schools. On the one hand, given the increased intensity of resources, it can be an attractive option, particularly for parents with children with more complex needs. On the other hand, the label can have a stigma attached to it. The perceived challenges and negative connotations can influence parental decisions and teachers’ willingness to work in such schools. Indeed, qualitative research with a small sample of principals suggests that there is a “misunderstanding in society about what it means to be a DEIS school” (Barry et al., 2022, p. 9[67]). During the OECD review visit, some stakeholders suggested that DEIS is now being viewed mostly positively by parents. Other stakeholders mentioned that some DEIS schools have substantial challenges recruiting staff, which could affect student performance. Other concrete evidence of the consequences of this DEIS labelling is not available. Access to comprehensive data could shed more light on the evolving nature of the DEIS label’s impact, examining enrolment trends and performance development in DEIS and non-DEIS schools. These research proposals should be viewed in light of the recommendation below on facilitating research that could provide information on the causal effects of the DEIS programme.
However, the use of standardised assessments in the Irish context is not without challenges. For instance, in primary schools, the DoE would need to address the fact that schools are free to choose providers of standardised testing, whose tests may not be comparable (Gilleece and Clerkin, 2024[89]). The scale and scope of this issue need to be explored further to evaluate to what extent data are incomparable and heterogeneous. Furthermore, data from standardised assessments in primary education are not returned to the DoE at the individual level; should this practice be preserved, value-added modelling in the standard sense would remain impossible.
Moreover, while quantitative analyses can provide valuable insights into school performance, a balanced approach incorporating qualitative sources, such as external evaluations and school self‑evaluations, is needed to paint a comprehensive picture (Gilleece, 2014[111]; OECD, 2013[2]). At the same time, however, qualitative approaches have their own limitations: the errors associated with qualitative judgements cannot readily be quantified, and such evaluations are potentially subjective (Gilleece, 2014[111]). Thus, a better use of administrative data, complemented by qualitative insights, can shed light on nuances and idiosyncrasies in the school system, facilitating a more holistic understanding of achievement differences between schools.
Promote research that could provide more information on the causal effects of the DEIS programme
Considerable work has been done in Ireland in developing practical guidelines to support high‑quality evaluation (Department of Children and Youth Affairs, 2019[114]; Department of Children, Equality, Disability, Integration and Youth, 2021[115]; Department of Children, Equality, Disability, Integration and Youth, 2023[116]; Department of Children, Equality, Disability, Integration and Youth, 2021[117]; Gilleece and Clerkin, 2024[89]). In line with these and by collecting more data at the individual student level and gaining access to a broader range of student and household characteristics, it might be possible for researchers to use a range of statistical techniques that can provide more information on the causal mechanisms of the DEIS programme without conducting randomised controlled trials. These include regression discontinuity, synthetic cohort matching and difference‑in‑differences. In Ireland, the benefits of these approaches have been identified by the ERC (Gilleece and Clerkin, 2024[89]).
Regression discontinuity design is a quasi-experimental method that exploits a discontinuity in the data by dividing the studied population into treatment and control groups based on whether participants fall above or below a specified threshold or cut-off point (European Commission, Directorate-General for Education, Youth, Sport and Culture, 2022[90]). These cut-off points can relate to, e.g. a minimum score in an examination that allows progression to the next educational level, or a score that determines participation in a programme. The underlying assumption is that individuals just above or below the threshold are similar, allowing observed differences in outcomes to be attributed to the effect of the programme under examination (Lee and Lemieux, 2010[118]). Another assumption is that individuals do not have control over whether they participate in the programme. The regression discontinuity approach assesses the programme’s effect by comparing the performance of the target group (just above the threshold) with that of the control group (just below the threshold). It can estimate the average treatment effect on the treated (Box 6.3). However, a limitation of this methodology is that the programme’s impact can only be attributed to those just above and below the cut-off point, preventing a comprehensive assessment of the overall effect on all participants (ibid.). Given that the DEIS programme has a specific school‑level cut-off point for new entrants to the programme, this technique could be explored as a way to estimate the programme’s effects on school outcomes. Indeed, Gilleece, Flannery and Clerkin (Forthcoming[119]) aim to examine schools with similar levels of socio‑economic disadvantage and compare those that received additional supports under the DEIS programme to those that did not, using regression discontinuity. The authors plan to measure the impacts of the DEIS programme on post‑primary school‑average Junior Certificate achievement and retention outcomes from 2007 to 2016 (ibid.).
Box 6.3. Regression discontinuity to estimate returns to education quality for low-skilled students
Canaan and Mouganie focused on understanding the labour market returns to higher education quality for low‑skilled students in France. They used a regression discontinuity design to compare students who had marginally passed and failed the French upper secondary exit exam (baccalauréat général) on the first attempt. They exploited the natural threshold (the passing score of 10 points), allowing for a quasi‑experimental comparison of students with similar scores just below and above the cut-off but differing access to higher education institutions.
The authors recognised potential concerns with the regression discontinuity methodology. One is the possible manipulation of scores by students around the threshold. However, this is improbable given that the exam is in an essay format, making it unlikely for students to control their grades. Another concern is for graders and administrators to sort students below or above the threshold. However, given that the exams are anonymised, it is implausible for initial test scores to be strategically manipulated. The researchers also conducted several statistical robustness checks.
Students who had marginally passed the exam were more likely to enrol in science, technology, engineering and mathematics fields, and attend higher education institutions with better peer quality without affecting the quantity of education pursued. The findings also indicate a 12.5% increase in earnings for these students at ages 27 and 29, with no significant effect on employment rates. The study concludes that access to higher-quality post‑secondary education significantly raises earnings for low-skilled students.
Source: Canaan and Mouganie (2018[120]), Returns to Education Quality for Low-Skilled Students: Evidence from a Discontinuity, http://dx.doi.org/10.1086/694468.
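For illustration, the sketch below shows how a sharp regression discontinuity estimate around a school-level eligibility cut-off could be computed. The running variable, cut-off value, bandwidth and column names are hypothetical placeholders; applied work would, among other things, test for manipulation of the running variable and check sensitivity to the bandwidth choice.

```python
# Minimal sketch of a sharp regression discontinuity estimate around a
# hypothetical eligibility cut-off. All names and values are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("schools.csv")  # hypothetical school-level dataset
CUTOFF = 0.0      # assumed eligibility threshold on the running variable
BANDWIDTH = 0.1   # compare only schools close to the threshold

df["running"] = df["disadvantage_score"] - CUTOFF
df["treated"] = (df["running"] >= 0).astype(int)  # sharp design: eligibility = treatment

# Local linear regression with separate slopes on each side of the cut-off;
# the coefficient on `treated` is the estimated local treatment effect.
local = df[df["running"].abs() <= BANDWIDTH]
rdd = smf.ols("retention_rate ~ treated + running + treated:running",
              data=local).fit(cov_type="HC1")
print(rdd.params["treated"], rdd.bse["treated"])
```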
Synthetic cohort matching is a method to estimate causal effects by creating a comparison group that resembles the treatment group in observed characteristics. For instance, each student who is part of a programme under evaluation (or attends a school that is part of a programme) is matched, based on observable characteristics such as performance, socio‑economic background, demographics, etc., to a counterpart in the data who is not part of the programme (a “synthetic control group”) (Box 6.4). The central assumption of this method is that the matched students are indistinguishably similar, not only on observable but also on unobservable characteristics (assumed to be highly correlated with the observables). These can include motivation, talent and skills, which are rarely observed in data (European Commission, Directorate-General for Education, Youth, Sport and Culture, 2022[90]). If the DoE adopts some of the policy recommendations above, namely broadening the pool of administrative data and getting access to a wide range of observable characteristics of students, there may be merit in considering how this technique could be used to estimate the effects of the DEIS programme on student outcomes.
Box 6.4. Synthetic cohort matching to estimate engagement between online and face‑to‑face learners
Paulsen and McCormick used cohort matching to compare student engagement levels of online learners, face‑to‑face learners and dual‑mode learners who took online and face-to-face courses in US higher education in 2015. They matched students based on their observable characteristics, such as age, gender, race, study field and enrolment status. By matching students who are similar on these characteristics, the authors aimed to isolate the effect of modality on student engagement and reduce the bias caused by the differences in the student populations. The results suggest that online learning for those particular students did not have a negative impact on most aspects of student engagement, except for collaborative learning and interaction with faculty. However, contrary to studies that did not use cohort matching techniques, they also found that the differences in supportive environment and learning strategies between online and face-to-face learners were mainly due to the different characteristics of the two groups, such as age, work and family responsibilities.
Source: Paulsen and McCormick (2020[121]), Reassessing Disparities in Online Learner Student Engagement in Higher Education, https://doi.org/10.3102/0013189X19898690.
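A minimal sketch of the matching step described above is shown below: each programme participant is paired with the most similar non-participant on a set of observed characteristics, and the outcomes of the matched groups are then compared. All column names and the treatment indicator are hypothetical placeholders.

```python
# Minimal sketch of nearest-neighbour matching on observed characteristics.
# Column names and the treatment indicator are hypothetical placeholders.
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("students.csv")  # hypothetical student-level dataset
covariates = ["ses", "prior_score", "age"]

treated = df[df["in_programme"] == 1]
control = df[df["in_programme"] == 0]

# Standardise covariates so that distances weight each dimension comparably.
scaler = StandardScaler().fit(df[covariates])
nn = NearestNeighbors(n_neighbors=1).fit(scaler.transform(control[covariates]))
_, idx = nn.kneighbors(scaler.transform(treated[covariates]))
matched_control = control.iloc[idx.ravel()]

# The mean outcome difference between treated students and their matches
# approximates the average treatment effect on the treated, provided the
# matched pairs are also comparable on unobserved characteristics.
att = treated["score"].mean() - matched_control["score"].mean()
print(f"Estimated effect on the treated: {att:.2f}")
```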
Finally, the difference-in-differences (DID) method involves selecting two groups or areas, a treated and a control group, and comparing their outcomes before and after a programme, practice or policy implementation (Box 6.5). The standard key assumption for the DID method is the common trend assumption, positing that the treated and control groups would have evolved similarly without the intervention (programme, practice or policy). The DID approach then compares the pre- and post-treatment levels of an outcome variable in the two groups, allowing for an assessment of the programme’s overall impact (European Commission, Directorate-General for Education, Youth, Sport and Culture, 2022[90]). Compared to regression discontinuity, DID assesses the overall impact of treatment and not just local effects. DID can also be combined with synthetic cohort matching if researchers can access a wide range of panel data (Arkhangelsky et al., 2021[122]). Indeed, combining DID and synthetic cohort matching could be a feasible methodology for getting closer to the causal mechanisms behind the DEIS programme, assuming that the necessary assumptions are met and the relevant data are available, although, as previously noted, there are some limitations with data currently accessible in Ireland which may restrict analytical opportunities.
Box 6.5. Difference-in-differences to estimate the effectiveness of modular education on early school leaving rates
Mazrekaj and De Witte evaluated the effectiveness of modular education on early school leaving rates in the Flemish Community of Belgium using a difference-in-differences methodology. Modular education is a system where conventional courses are divided into smaller modules. The researchers used a difference-in-differences framework with diverse adoption dates per school to exploit a policy change that introduced modular education for only some study programmes. The results indicate that modular education reduced early school leaving rates in vocational education and training by 2.5 percentage points, with the most substantial effects observed among students with an immigrant background. Furthermore, students enrolled in modular education were more likely to be employed and earned higher wages.
Source: Mazrekaj and De Witte (2019[123]), The effect of modular education on school dropout, https://doi.org/10.1002/berj.3569.
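The sketch below illustrates the textbook two-group, two-period version of the DID estimator. The dataset, the assumed programme start year and the column names are hypothetical placeholders; settings with staggered adoption, as in the study in Box 6.5, require more elaborate estimators.

```python
# Minimal sketch of a two-group, two-period difference-in-differences
# estimate. Dataset, start year and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("school_panel.csv")  # hypothetical school-year panel
df["post"] = (df["year"] >= 2017).astype(int)  # assumed programme start
# `treated` equals 1 for schools that joined the programme, 0 otherwise.

# The coefficient on treated:post is the DID estimate: the change in the
# outcome for treated schools beyond the change observed in control schools.
did = smf.ols("dropout_rate ~ treated + post + treated:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(did.params["treated:post"], did.bse["treated:post"])
```

A real evaluation would first inspect pre-period trends in the two groups, since the estimate is only credible under the common trend assumption described above.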
Ensuring the quality and transparency of evaluation processes requires a multifaceted approach, combining quantitative and qualitative methodologies. As mentioned above, this is one of the strengths of the education system and should be preserved. While quantitative evaluations provide numerical insights, qualitative evaluations are essential for a comprehensive understanding of policy contexts, aiding in identifying aspects to be measured and evaluated. Moreover, qualitative approaches enable the analysis of policy implementation processes, shedding light on the roles of various actors, and contributing to a nuanced understanding of why certain policies are successful or supported in specific contexts (European Commission, Directorate-General for Education, Youth, Sport and Culture, 2022[90]). Thus, integrating diverse evaluation and assessment instruments, encompassing quantitative and qualitative methods, is necessary to trace pathways to quality and training, offering policy makers and education institutions valuable and meaningful insights (ibid.).
Strengthen the use of data at the school level
Strengthening the use of data at the school level is paramount for informed decision making and effective policy implementation. The analysis of baseline data, including input from teachers, parents and students, along with formative and summative assessments, plays a crucial role in setting measurable targets and identifying expected outcomes for students and the school. In Ireland, despite the emphasis on data utilisation through the DEIS action planning process, some schools still face challenges in analysing this information and formulating SMART targets that are both meaningful and realistic within their specific contexts (Department of Education, 2022[78]; Department of Education, 2022[38]). Setting SMART targets and understanding the underlying data remain particularly challenging for over a third of inspected DEIS schools, indicating a need for further support and guidance (ibid.).
The challenges persist due to uncertainties in regard to the use and analysis of data, and the monitoring and evaluation of the targets set. Schools require additional guidance, particularly from Oide, to enhance their capacity in these aspects (Department of Education, 2022[78]). Professional development activities in this area should be highly practical and, ideally, participants should work with data that are regularly available to them. Moreover, working in teams with other school staff members is a promising strategy for implementing data use in schools (Schildkamp et al., 2019[124]). To this end, professional development activities could encourage team participation (Box 6.6).
Box 6.6. Practitioner data use in schools: workshop toolkit in the United States
The Practitioner Data Use Workshop is a targeted toolkit designed by the National Center for Education Evaluation and Regional Assistance (US Department of Education) to enhance educators’ capabilities in collaborative, data-driven inquiry and instructional decision making. The workshop introduces participants to the fundamental concept and process of the data inquiry cycle, facilitating hands-on practice. The toolkit offers a comprehensive resource for each step of the data inquiry cycle, providing activities, materials and critical points for each workshop segment.
A critical aspect of the workshop’s design is the recommendation for participants to attend in school teams. This collaborative approach fosters a supportive network of practitioners using data to inform instruction. The workshop can be adapted to accommodate teams within a single school, teams from multiple schools within a region, or teams from various schools across numerous areas. Teams are encouraged to bring their datasets, including student performance on standardised assessments, disaggregated by domains and other relevant metrics showing changes over time.
The learning goals of the workshop are comprehensive, aiming for participants to become familiar with an inquiry framework for interpreting data, engage in a protocol to analyse data, identify root causes for student learning challenges, and develop learning goals and action plans. Participants are expected to leave the workshop with a specific data plan and a process applicable to their educational contexts.
Source: Bocala et al. (2014[125]), Practitioner Data Use in Schools: Workshop Toolkit, https://ies.ed.gov/ncee/rel/regions/northeast/pdf/REL_2015043.pdf (accessed on 22 November 2023).
Improving clarity on the interface between school self-evaluation and DEIS action planning was also identified as essential for facilitating school improvement (Department of Education, 2022[78]). A comprehensive approach is needed to address these challenges, involving on-going professional development for teachers and principals, clear guidance on setting SMART targets, and improved coordination between self-evaluation and action-planning processes. The quality of leadership is often tied to principals’ use of data. Indeed, action planning for improvement; analysis of baseline data and information available to the school, including the views and perspectives of teachers, parents and students; setting targets; whole-school implementation; and regular monitoring and review of actions were identified by the Inspectorate as key indicators of highly effective leadership in DEIS schools (ibid.).
Improving the use of data in schools should not only involve training for current and prospective principals (as is now offered by Oide) but also external supports provided at a higher level to multiple schools. Supporting the digital transformation of schools needs to involve a careful reflection on the types of resources and services that are best provided to schools at scale or by qualified external staff (OECD, 2023[126]). Ireland’s “Digital Strategy for Schools to 2027” already identified some ways to strengthen digital capacity around schools by, e.g. proposing regional panels of approved providers to offer technical support and advice to schools as well as a longer-term perspective of providing centralised high-quality technical and maintenance support to schools (Department of Education, 2022[10]). This vital step in the right direction should be accompanied by a reflection on how the regional or central level can support schools in accessing and analysing data to help them in their self-evaluation and improvement planning process.
Some OECD countries have established local centres of expertise or maintain centrally coordinated networks of experts who can be dispatched to build capacity at the local level and support schools with needs related to the use of digital resources or the analysis of data (Box 6.7). Similar structures could provide a means in Ireland to provide schools with robust analyses of their assessments and other data based on their needs, while ensuring the protection of privacy and maintenance of analytical rigour.
Box 6.7. Strengthening schools’ digital capacity through regional and local expertise in France
A network of local digital advisors in France has supported local authorities in implementing digital education technologies since 2013. The advisors provide support on digital matters to the rectors of France’s 30 education academies (or administrative districts), liaise with local authorities and companies, lead initiatives, and facilitate networks around the uses of digital tools in education. The advisors also develop training programmes and mobilise knowledge for teachers to become more active in the use of digital tools for learning. Each academy has at least one digital education advisor, with most having fewer than 15, totalling several hundred advisors. In co-operation with the Directorate for Digital Education, this strong network of skilled experts could be mobilised to prepare and oversee the transition to remote learning during the COVID-19 pandemic (OECD, 2023[126]; Vincent-Lancrin, Cobo Romaní and Reimers, 2022[127]).
Source: Adapted from OECD (2023[126]), Shaping Digital Education: Enabling Factors for Quality, Equity and Efficiency, https://doi.org/10.1787/bac4dc9f-en.
References
[39] Archer, P. and F. Shortt (2003), Review of the Home/School/Community Liaison Scheme, https://www.erc.ie/documents/hsclreview03.pdf (accessed on 15 April 2024).
[122] Arkhangelsky, D. et al. (2021), “Synthetic Difference-in-Differences”, American Economic Review, Vol. 111/12, pp. 4088-4118, https://doi.org/10.1257/aer.20190159.
[42] AsIAm (2019), Invisible Children, https://asiam.ie/wp-content/uploads/2023/01/Invisible-Children-Survey-on-School-Absence-Withdrawl-in-Irelands-Autism-Community-April-2019-1.pdf (accessed on 27 November 2023).
[83] Balestra, C. and L. Fleischer (2018), “Diversity statistics in the OECD: How do OECD countries collect data on ethnic, racial and indigenous identity?”, OECD Statistics Working Papers, No. 2018/09, OECD Publishing, Paris, https://doi.org/10.1787/89bae654-en.
[67] Barry, G. et al. (2022), “School self-evaluation and empowering leadership in DEIS schools: an exploration of success”, Irish Educational Studies, pp. 1-18, https://doi.org/10.1080/03323315.2022.2135569.
[125] Bocala, C. et al. (2014), Practitioner Data Use in Schools: Workshop Toolkit, National Center for Education Evaluation and Regional Assistance, https://ies.ed.gov/ncee/rel/regions/northeast/pdf/REL_2015043.pdf (accessed on 22 November 2023).
[68] Brady, A. (2019), “Anxiety of performativity and anxiety of performance: self-evaluation as bad faith”, Oxford Review of Education, Vol. 45/5, pp. 605-618, https://doi.org/10.1080/03054985.2018.1556626.
[120] Canaan, S. and P. Mouganie (2018), “Returns to Education Quality for Low-Skilled Students: Evidence from a Discontinuity”, Journal of Labor Economics, Vol. 36/2, pp. 395-436, https://doi.org/10.1086/694468.
[101] Chaudry, A. and C. Wimer (2016), “Poverty is Not Just an Indicator: The Relationship Between Income, Poverty, and Child Well-Being”, Academic Pediatrics, Vol. 16/3, pp. S23-S29, https://doi.org/10.1016/j.acap.2015.12.010.
[86] Citizens Information (2023), Medical cards, https://www.citizensinformation.ie/en/health/medical-cards-and-gp-visit-cards/medical-card/ (accessed on 8 December 2023).
[50] Delaney, E. et al. (2023), PIRLS 2021: Reading results for Ireland, Educational Research Centre, https://www.erc.ie/wp-content/uploads/2023/05/PIRLS-2021_Reading-Results-for-Ireland.pdf (accessed on 27 May 2024).
[114] Department of Children and Youth Affairs (2019), Evaluating Government-Funded Human Services: Evidence into Policy Guidance Note #3, https://assets.gov.ie/27089/71d187ec8961432aa86d91a3976d2743.pdf (accessed on 5 April 2024).
[116] Department of Children, Equality, Disability, Integration and Youth (2023), Managing Evaluation Challenges, https://assets.gov.ie/279057/44b58e17-102a-4a88-90be-bc0d59bb2e73.pdf (accessed on 5 April 2024).
[115] Department of Children, Equality, Disability, Integration and Youth (2021), Frameworks for Policy, Planning and Evaluation, https://www.gov.ie/en/publication/5a620-frameworks-for-policy-planning-and-evaluation-evidence-into-policy-guidance-note-7/ (accessed on 5 April 2024).
[117] Department of Children, Equality, Disability, Integration and Youth (2021), Policy-Relevant Research Design: Picking Your Method, https://assets.gov.ie/279057/44b58e17-102a-4a88-90be-bc0d59bb2e73.pdf (accessed on 5 April 2024).
[97] Department of Children, Equality, Disability, Integration and Youth (2017), National Traveller and Roma Inclusion Strategy 2017 – 2021, https://www.gov.ie/pdf/?file=https://assets.gov.ie/43310/d7d54fbff0f4418982856e7dddaf78c1.pdf#page=null (accessed on 5 April 2024).
[30] Department of Education (2024), A Guide to Early Years Education Inspection, https://assets.gov.ie/233708/a923cf1c-6565-48d8-a96b-2b1330e70b14.pdf (accessed on 4 April 2024).
[14] Department of Education (2024), OECD Review of resourcing schools to address educational disadvantage: Country Background Report Ireland, Department of Education, https://s3-eu-west-1.amazonaws.com/govieassets/296017/4d1ac422-5475-470e-b910-4c80a83c43bc.pdf.
[32] Department of Education (2023), Circular 0034/2023: DEIS (Delivering Equality of Opportunity in Schools) Action Planning and Grant Allocation for All DEIS Schools, https://www.gov.ie/pdf/?file=https://assets.gov.ie/262734/6df1e978-c5d2-4dc1-89da-02ccd216f304.pdf#page=null (accessed on 23 April 2024).
[4] Department of Education (2023), Department of Education Statement of Strategy, 2023-2025, Department of Education, https://www.gov.ie/en/publication/d7691-department-of-education-statement-of-strategy-2023-2025/ (accessed on 27 November 2023).
[59] Department of Education (2023), Guidelines on the appropriate use of the Attendance Campaign Support Grant for Primary and Post-Primary Schools, https://www.gov.ie/pdf/?file=https://assets.gov.ie/274562/4add5938-d744-488a-a6d2-503ba92e21f0.pdf#page=null (accessed on 23 April 2024).
[29] Department of Education (2023), Inspection Reports, https://www.gov.ie/en/publication/b9e7d3-inspection-reports/ (accessed on 27 November 2023).
[80] Department of Education (2023), Inspectorate, https://www.gov.ie/en/organisation-information/818fa1-inspectorate/ (accessed on 27 November 2023).
[73] Department of Education (2023), Looking at our School 2022, https://www.gov.ie/en/publication/b1bb3-looking-at-our-school-2022/ (accessed on 27 November 2023).
[75] Department of Education (2023), National Educational Psychological Service (NEPS) resources and publications, https://www.gov.ie/en/collection/97aa18-national-educational-psychological-service-neps-resources-and-public/ (accessed on 5 April 2024).
[60] Department of Education (2023), National School Attendance Campaign 2023, https://www.gov.ie/pdf/?file=https://assets.gov.ie/271473/7898d032-b275-45e0-9de2-89edece393a3.pdf#page=null (accessed on 23 April 2024).
[99] Department of Education (2023), Pupils from the Traveller Community, https://www.gov.ie/pdf/?file=https://assets.gov.ie/258672/64703960-5409-4314-a829-5e6b6603018b.pdf#page=null (accessed on 5 December 2023).
[98] Department of Education (2023), Pupils from the Traveller Community 2016-20, https://www.gov.ie/pdf/?file=https://assets.gov.ie/258672/64703960-5409-4314-a829-5e6b6603018b.pdf#page=null (accessed on 30 January 2024).
[17] Department of Education (2023), Retention, https://www.gov.ie/en/collection/retention/ (accessed on 27 November 2023).
[65] Department of Education (2023), Retention rates of pupils in second-level schools – Entry cohort 2016, https://www.gov.ie/pdf/?file=https://assets.gov.ie/272338/d91a62d5-3550-4df0-858c-e7602d67aa7e.pdf#page=null (accessed on 29 January 2024).
[11] Department of Education (2023), Traveller and Roma Education Strategy, Department of Education, https://www.gov.ie/en/consultation/2545f-traveller-and-roma-education-strategy/ (accessed on 27 November 2023).
[5] Department of Education (2022), Annual Reports of Department of Education, https://www.gov.ie/en/collection/department-of-education-and-skills-annual-reports/ (accessed on 27 November 2023).
[38] Department of Education (2022), Chief Inspector’s Report September 2016-December 2020, https://www.gov.ie/pdf/?file=https://assets.gov.ie/232560/fac408b3-689b-44cb-a8f1-3cb090018a05.pdf#page=null (accessed on 22 December 2023).
[10] Department of Education (2022), Digital Strategy for Schools to 2027, Department of Education, https://www.gov.ie/en/publication/69fb88-digital-strategy-for-schools/ (accessed on 27 November 2023).
[78] Department of Education (2022), Looking at DEIS Action Planning for Improvement in Primary and Post-Primary Schools, Department of Education, https://www.gov.ie/pdf/?file=https://assets.gov.ie/226977/cd9c8a0a-9374-4085-806e-f81be6a2081d.pdf#page=null (accessed on 27 November 2023).
[27] Department of Education (2022), Looking at Our School 2022: A Quality Framework for Post-Primary schools, https://assets.gov.ie/25261/c97d1cc531f249c9a050a9b3b4a0f62b.pdf (accessed on 27 May 2024).
[28] Department of Education (2022), Looking at Our School 2022: A Quality Framework for Primary Schools and Special Schools, https://assets.gov.ie/25260/4a47d32bf7194c9987ed42cd898e612d.pdf (accessed on 5 April 2024).
[16] Department of Education (2022), Minister Foley confirms waiving of fees for 2022 State Examinations, https://www.gov.ie/en/press-release/806cf-minister-foley-confirms-waiving-of-fees-for-2022-state-examinations/ (accessed on 15 April 2024).
[31] Department of Education (2022), School Self-Evaluation: Next Steps September 2022-June 2026, Department of Education, https://assets.gov.ie/232734/3e6ca885-96ec-45a6-9a08-3e810b7cd1ea.pdf (accessed on 27 November 2023).
[106] Department of Education (2022), The Refined DEIS identification model, https://assets.gov.ie/220043/d6b98002-a904-427f-b48a-0fa0af756ea7.pdf (accessed on 29 February 2024).
[36] Department of Education (2021), Rutland Street Pre-School Project, https://www.gov.ie/en/service/5a426-rutland-street-project/ (accessed on 17 January 2024).
[79] Department of Education (2018), SSE Update: Primary Edition - Issue 11, https://assets.gov.ie/195446/2b155271-b7f8-481f-b1e8-2dbcaeddf271.pdf (accessed on 27 November 2023).
[74] Department of Education and Skills (2019), Wellbeing Policy Statement and Framework for Practice, https://assets.gov.ie/24725/07cc07626f6a426eb6eab4c523fb2ee2.pdf (accessed on 27 November 2023).
[6] Department of Education and Skills (2017), National Strategy: Literacy and Numeracy for Learning and Life 2011-2020, Department of Education and Skills, https://assets.gov.ie/24960/93c455d4440246cf8a701b9e0b0a2d65.pdf (accessed on 27 November 2023).
[9] Department of Education and Skills (2011), National Strategy for Higher Education to 2030, Department of Education and Skills.
[82] Department of Education and Skills (n.d.), Report on the Review of DEIS, https://www.gov.ie/pdf/?file=https://assets.gov.ie/230369/44ce7126-6486-4f78-9e37-d617390d922a.pdf#page=null (accessed on 24 April 2024).
[12] Department of Housing, Local Government and Heritage (2022), Housing for All Youth Homelessness Strategy 2023-2025, https://assets.gov.ie/239255/99c987df-4439-4bc2-8730-be614eae1e2e.pdf (accessed on 27 November 2023).
[7] Donohue, B. et al. (2023), Education in a Dynamic World: the performance of students in Ireland in PISA 2022, Educational Research Centre, https://www.erc.ie/wp-content/uploads/2023/12/B23617-Education-in-a-Dynamic-World-Report-online-1.pdf (accessed on 17 June 2024).
[113] Doris, A., D. O’Neill and O. Sweetman (2022), “Good schools or good students? The importance of selectivity for school rankings”, Oxford Review of Education, Vol. 48/6, pp. 804-826, https://doi.org/10.1080/03054985.2022.2034611.
[103] Downes, P., S. Pike and S. Murphy (2020), Opening Statement Submission to the Oireachtas Joint Committee on Education, Further and Higher Education, Research, Innovation and Science.
[55] Duggan, A. et al. (2023), “Trends in educational inequalities in Ireland’s primary schools: an analysis based on TIMSS data (2011–2019)”, Large-scale Assessments in Education, Vol. 11/1, https://doi.org/10.1186/s40536-023-00188-2.
[84] Durante, F., C. Volpato and S. Fiske (2009), “Using the stereotype content model to examine group depictions in Fascism: An archival approach”, European Journal of Social Psychology, https://doi.org/10.1002/ejsp.637.
[41] Educate Together (2023), Evaluation of the Nurture Schools Project, https://www.educatetogether.ie/app/uploads/2023/11/Final-Nurture-Evaluation-1.pdf (accessed on 27 November 2023).
[46] Eivers, E. and A. Clerkin (eds.) (2013), Understanding achievement in PIRLS and TIMSS 2011, Educational Research Centre.
[81] Eivers, E. et al. (2010), The 2009 National Assessments of Mathematics and English Reading, Department of Education and Skills.
[90] European Commission, Directorate-General for Education, Youth, Sport and Culture (2022), Investing in our future – Quality investment in education and training, Publications Office of the European Union, https://doi.org/10.2766/45896.
[70] European Commission/EACEA/Eurydice (2015), Assuring Quality in Education: Policies and Approaches to School Evaluation in Europe, Publications Office of the European Union, https://data.europa.eu/doi/10.2797/678.
[110] Everson, K. (2016), “Value-Added Modeling and Educational Accountability”, Review of Educational Research, Vol. 87/1, pp. 35-70, https://doi.org/10.3102/0034654316637199.
[63] Flannery, D., L. Gilleece and J. Clavel (2023), “School socio-economic context and student achievement in Ireland: an unconditional quantile regression analysis using PISA 2018 data”, Large-scale Assessments in Education, Vol. 11/1, https://doi.org/10.1186/s40536-023-00171-x.
[51] Gilleece, L. (2015), “Parental involvement and pupil reading achievement in Ireland: Findings from PIRLS 2011”, International Journal of Educational Research, Vol. 73, pp. 23-36, https://doi.org/10.1016/j.ijer.2015.08.001.
[111] Gilleece, L. (2014), “Understanding achievement differences between schools in Ireland – can existing data-sets help?”, Irish Educational Studies, Vol. 33/1, pp. 75-98, https://doi.org/10.1080/03323315.2013.877220.
[89] Gilleece, L. and A. Clerkin (2024), “Towards more robust evaluation of policies and programmes in education: identifying challenges in evaluating DEIS and Reading Recovery”, Irish Educational Studies, pp. 1-29, https://doi.org/10.1080/03323315.2024.2334704.
[119] Gilleece, L., D. Flannery and A. Clerkin (Forthcoming), “A regression discontinuity analysis of retention and Junior Certificate achievement in DEIS and non-DEIS schools”.
[102] Gilleece, L. and G. McHugh (2022), “Validating school-based measures of educational disadvantage in Ireland”, Education Policy Analysis Archives, Vol. 30, https://doi.org/10.14507/epaa.30.7245.
[23] Gilleece, L. and S. Nelis (2023), Ireland’s 2021 National Assessments of Mathematics and English Reading: Exploring the home backgrounds, classrooms and schools of pupils in Urban DEIS schools, https://www.erc.ie/wp-content/uploads/2023/12/NAMER-21-DEIS-Context-Report-Final-Online.pdf (accessed on 23 April 2024).
[61] Gilleece, L. et al. (2020), Reading, mathematics and science achievement in DEIS schools: Evidence from PISA 2018, Educational Research Centre.
[72] Godfrey, D. (2020), “From External Evaluation, to School Self-evaluation, to Peer Review”, in School Peer Review for Educational Improvement and Accountability, Accountability and Educational Improvement, Springer International Publishing, Cham, https://doi.org/10.1007/978-3-030-48130-8_1.
[88] Golden, G. (2020), “Education policy evaluation: Surveying the OECD landscape”, OECD Education Working Papers, No. 236, OECD Publishing, Paris, https://doi.org/10.1787/9f127490-en.
[34] Houses of the Oireachtas (2019), Report on Education inequality & disadvantage and Barriers to Education, Houses of the Oireachtas, https://data.oireachtas.ie/ie/oireachtas/committee/dail/32/joint_committee_on_education_and_skills/reports/2019/2019-06-05_report-on-education-inequality-disadvantage-and-barriers-to-education_en.pdf (accessed on 27 November 2023).
[91] INTO and Educational Disadvantage Centre (2015), Review of DEIS: Poverty and Social Inclusion in Education, https://www.into.ie/app/uploads/2019/07/ReviewofDeis.pdf (accessed on 5 December 2023).
[43] ISSU (2023), Leaving Certificate Applied Reform Report 2023, Irish Second-Level Students’ Union, https://static1.squarespace.com/static/5d36029ba09e370001fa2248/t/649ac101f162595cc2651b00/1687863637101/_ISSU+LCA+Reform+Report.pdf (accessed on 27 November 2023).
[53] Karakolidis, A. et al. (2021), “Educational inequality in primary schools in Ireland in the early years of the national literacy and numeracy strategy: An analysis of national assessment data”, Irish Journal of Education, Vol. 44/1, pp. 1-24, https://www.erc.ie/wp-content/uploads/2021/07/Karakolidis-et-al.-2021-IJE.pdf (accessed on 6 December 2023).
[54] Karakolidis, A. et al. (2021), “Examining educational inequalities: insights in the context of improved mathematics performance on national and international assessments at primary level in Ireland”, Large-scale Assessments in Education, Vol. 9/1, https://doi.org/10.1186/s40536-021-00098-1.
[47] Kavanagh, L. and S. Weir (2018), The evaluation of DEIS: the lives and learning of urban primary school pupils, 2007-2016, Educational Research Centre.
[48] Kavanagh, L., S. Weir and E. Moran (2017), The evaluation of DEIS: monitoring achievement and attitudes among urban primary school pupils from 2007 to 2016, Educational Research Centre.
[18] Kelleher, C. and S. Weir (2017), The impact of DEIS on the size of junior classes in urban primary schools in 2014/15 with comparative data from 2009/10, Educational Research Centre, https://www.erc.ie/wp-content/uploads/2017/03/The-Impact-of-DEIS-on-Class-Size-in-Primary-Schools-31.03.2017.pdf (accessed on 27 November 2023).
[118] Lee, D. and T. Lemieux (2010), “Regression Discontinuity Designs in Economics”, Journal of Economic Literature, Vol. 48/2, pp. 281-355, https://doi.org/10.1257/jel.48.2.281.
[123] Mazrekaj, D. and K. De Witte (2019), “The effect of modular education on school dropout”, British Educational Research Journal, Vol. 46/1, pp. 92-121, https://doi.org/10.1002/berj.3569.
[52] McCoy, S., A. Quail and E. Smyth (2014), “The effects of school social mix: unpacking the differences”, Irish Educational Studies, Vol. 33/3, pp. 307-330, https://doi.org/10.1080/03323315.2014.955746.
[49] McGinnity, F., M. Darmody and A. Murray (2015), Academic Achievement among Immigrant Children in Irish Primary Schools, ESRI, https://www.econstor.eu/bitstream/10419/129412/1/835970795.pdf (accessed on 7 December 2023).
[64] Millar, D. (2017), School Attendance Data from Primary and Post-Primary Schools 2016/17, https://www.tusla.ie/uploads/content/AAR_16-17.pdf (accessed on 27 November 2023).
[76] NCCA (2024), Wellbeing, https://ncca.ie/en/primary/primary-developments/wellbeing/ (accessed on 5 April 2024).
[13] NCCA (2017), Primary Assessment: Standardised Testing, National Council for Curriculum and Assessment, https://ncca.ie/media/5355/primary_standardised-testing.pdf (accessed on 27 November 2023).
[21] NCCA and University College Dublin (n.d.), About, https://cslstudy.ie/ (accessed on 15 April 2024).
[8] Nelis, S. and L. Gilleece (2023), Ireland’s National Assessments of Mathematics and English Reading 2021: A focus on achievement in urban DEIS schools, Educational Research Centre, https://www.erc.ie/wp-content/uploads/2023/05/B23572-NAMER-DEIS-report-Online.pdf (accessed on 27 May 2024).
[22] Nelis, S. et al. (2021), Beyond achievement: home, school and wellbeing findings from PISA 2018 for students in DEIS and non-DEIS schools, Educational Research Centre.
[19] Nolan, A. and E. Smyth (2021), Risk and protective factors for mental health and wellbeing in childhood and adolescence, ESRI, https://doi.org/10.26504/rs120.
[1] OECD (2024), Public governance: Monitoring and evaluation, https://www.oecd.org/governance/budgeting/monitoring-and-evaluation/ (accessed on 15 April 2024).
[94] OECD (2024), Together for Children and Young People in Ireland: Towards a New Governance Framework, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/12f4dfb2-en.
[3] OECD (2023), Equity and Inclusion in Education: Finding Strength through Diversity, OECD Publishing, Paris, https://doi.org/10.1787/e9072e21-en.
[24] OECD (2023), PISA 2022 Results (Volume I): The State of Learning and Equity in Education, PISA, OECD Publishing, Paris, https://doi.org/10.1787/53f23881-en.
[25] OECD (2023), PISA 2022 Results (Volume II): Learning During – and From – Disruption, PISA, OECD Publishing, Paris, https://doi.org/10.1787/a97db61c-en.
[126] OECD (2023), Shaping Digital Education: Enabling Factors for Quality, Equity and Efficiency, OECD Publishing, Paris, https://doi.org/10.1787/bac4dc9f-en.
[26] OECD (2020), “Education Policy Outlook in Ireland”, OECD Education Policy Perspectives, No. 18, OECD Publishing, Paris, https://doi.org/10.1787/978e377b-en.
[71] OECD (2015), Education Policy Outlook 2015: Making Reforms Happen, OECD Publishing, Paris, https://doi.org/10.1787/9789264225442-en.
[2] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://doi.org/10.1787/9789264190658-en.
[107] OECD (2008), “Objectives and Use of Value-Added Modelling”, in Measuring Improvements in Learning Outcomes: Best Practices to Assess the Value-Added of Schools, OECD Publishing, Paris, https://doi.org/10.1787/9789264050259-2-en.
[108] Ofsted (2023), School inspection data summary report (IDSR) guide, https://www.gov.uk/guidance/school-inspection-data-summary-report-idsr-guide (accessed on 27 November 2023).
[121] Paulsen, J. and A. McCormick (2020), “Reassessing Disparities in Online Learner Student Engagement in Higher Education”, Educational Researcher, Vol. 49/1, pp. 20-29, https://doi.org/10.3102/0013189x19898690.
[93] PDST (n.d.), Leading DEIS Planning: A Data & Research-Informed Approach, https://www.pdst.ie/post-primary/leadership/data-informed-deis-planning-leadership (accessed on 7 February 2024).
[33] Reay, D. (2022), “Lessons from abroad: how can we achieve a socially just educational system?”, Irish Educational Studies, Vol. 41/3, pp. 425-440, https://doi.org/10.1080/03323315.2022.2085766.
[104] Research Centre for Education and the Labour Market (n.d.), The Netherlands Cohort Study on Education, https://www.roa.nl/research/research-projects/netherlands-cohort-study-education (accessed on 22 November 2023).
[109] Rubin, D., E. Stuart and E. Zanutto (2004), “A Potential Outcomes View of Value-Added Assessment in Education”, Journal of Educational and Behavioral Statistics, Vol. 29/1, pp. 103-116, https://doi.org/10.3102/10769986029001103.
[40] Ryan, S. (1994), Home-School-Community Liaison Scheme: Final Evaluation Report, http://www.erc.ie/documents/hsclfinal94.pdf (accessed on 15 April 2024).
[124] Schildkamp, K. et al. (2019), “How school leaders can build effective data teams: Five building blocks for a new wave of data-informed decision making”, Journal of Educational Change, Vol. 20/3, pp. 283-325, https://doi.org/10.1007/s10833-019-09345-3.
[77] Shiel, G. et al. (2022), Reading literacy in Ireland in PISA 2018: Performance, policy and practice, Educational Research Centre.
[85] Simon, P. and V. Piché (2012), “Accounting for ethnic and racial diversity: the challenge of enumeration”, Ethnic and Racial Studies, Vol. 35/8, pp. 1357-1365, https://doi.org/10.1080/01419870.2011.634508.
[69] Skerritt, C. and M. Salokangas (2019), “Patterns and paths towards privatisation in Ireland”, Journal of Educational Administration and History, Vol. 52/1, pp. 84-99, https://doi.org/10.1080/00220620.2019.1689104.
[112] Sloane, F., J. Oloff-Lewis and S. Kim (2013), “Value-added models of teacher and school effectiveness in Ireland: wise or otherwise?”, Irish Educational Studies, Vol. 32/1, pp. 37-67, https://doi.org/10.1080/03323315.2013.773233.
[37] Smyth, E. et al. (2016), Review of the Droichead Teacher Induction Pilot Programme, Economic and Social Research Institute.
[56] Smyth, E., S. McCoy and G. Kingston (2015), Learning from the Evaluation of DEIS, ESRI.
[20] Smyth, E. et al. (2023), Growing Up in Ireland: Key findings: Cohort ’08 at 13 years old, https://www.esri.ie/publications/growing-up-in-ireland-key-findings-cohort-08-at-13-years-old (accessed on 23 April 2024).
[105] Statistics Sweden (n.d.), Mikrodata från individregister [Microdata from individual registers], https://www.scb.se/vara-tjanster/bestall-data-och-statistik/bestalla-mikrodata/vilka-mikrodata-finns/individregister/ (accessed on 22 November 2023).
[44] Teaching Council (n.d.), Research, https://www.teachingcouncil.ie/professional-learning/research/ (accessed on 5 April 2024).
[45] Teaching Council (n.d.), Research Bursary Scheme (RBS), https://www.teachingcouncil.ie/professional-learning/research/research-support-framework/research-bursary-scheme-rbs/ (accessed on 5 April 2024).
[100] Tickner, N. (2017), Interesting Facts – First Look at Data from POD, 2016/2017, Department of Education and Skills, https://www.gov.ie/pdf/?file=https://assets.gov.ie/27570/598a9372f7484543978a1949edf1661e.pdf#page=1 (accessed on 24 April 2024).
[58] Tusla (2023), School Attendance Data Primary and Post-Primary Schools And Student Absence Reports Primary and Post-Primary Schools 2019-2022, https://www.tusla.ie/uploads/content/AAR_SAR_2019_22.pdf (accessed on 27 November 2023).
[127] Vincent-Lancrin, S., C. Cobo Romaní and F. Reimers (eds.) (2022), How Learning Continued during the COVID-19 Pandemic: Global Lessons from Initiatives to Support Learners and Teachers, OECD Publishing, Paris, https://doi.org/10.1787/bbeca162-en.
[66] Weir, S. and P. Archer (2011), A report on the first phase of the evaluation of DEIS: Report to the Department of Education and Skills, Educational Research Centre.
[92] Weir, S. and P. Archer (2005), “A Review of Procedures to Select Schools for Support to Deal with Educational Disadvantage”, The Irish Journal of Education, Vol. xxxvi, pp. 63-85, https://www.jstor.org/stable/30077504 (accessed on 5 April 2024).
[87] Weir, S. and S. Denner (2013), The evaluation of the School Support Programme under DEIS: changes in pupil achievement in urban primary schools between 2007 and 2013, Educational Research Centre.
[96] Weir, S., D. Errity and L. McAvinue (2015), “Factors Associated with Educational Disadvantage in Rural and Urban Areas”, The Irish Journal of Education, Vol. xl, pp. 94-110.
[15] Weir, S. and L. Kavanagh (2018), The evaluation of DEIS at post-primary level: closing the achievement and attainment gaps, Educational Research Centre.
[57] Weir, S. et al. (2017), Addressing educational disadvantage. A review of evidence from the international literature and of strategy in Ireland: An update since 2005, Educational Research Centre.
[35] Weir, S. et al. (2018), Partnership in DEIS schools: A survey of Home-School-Community Liaison coordinators in primary and post-primary schools in Ireland, Educational Research Centre.
[95] Weir, S. and L. McAvinue (2013), The Achievements and Characteristics of Pupils Attending Rural Schools Participating in DEIS, Educational Research Centre, https://www.erc.ie/documents/rural_report2013.pdf (accessed on 24 April 2024).
[62] Weir, S. et al. (2014), A Report on the Evaluation of DEIS at Second Level, Educational Research Centre, https://www.erc.ie/documents/deisevaluation_secondlevel_report2014.pdf (accessed on 27 November 2023).
Notes
1. Categories align with those established by the Central Statistics Office (CSO) and derive from the Census of Population since 2011 (Department of Education, 2024[14]). The categories encompass White Irish, Irish Traveller, Roma, Black or Black Irish (African or other Black background), Asian or Asian Irish (Chinese or other Asian background), and an "Other" category, including mixed backgrounds.
2. HP Index data are based on students across all grades in the participating schools. In contrast, measures from PISA were derived only from those students participating in PISA, i.e. those aged 15 years at the time of assessment. The authors acknowledge that the lack of data on socio-economic background (PISA index of economic, social and cultural status) for students across all grades is one limitation of this measure.